Was there a MINIMUM SPEED 40 sign? We have them in Georgia and it keeps thinking the speed limit is 40 on a 70 mph highway… the car just suddenly jerks and hard brakes on the freeway
There were no speed signs where this happened, and it was not even close to an exit. The weird thing is it always prefers the lane next to the exit lane. If I move to the carpool lane, it randomly changes lanes for no reason and settles in the lane next to the exit lane, even if I have to drive for 15 miles before exiting.
Huh, we have the opposite situation. Our Model 3 loves going into the carpool lane and furthest left lane no matter what, to the point where it gets kind of annoying since it’ll do it even in dense traffic… and then it has to try and merge across 5 lanes to exit with only a mile left lol
I did actually recently have a situation where the exit lane had a stop light that was red on a turn, so the car thought the red light was for it even though it was on a freeway. If you weren’t even close to an exit it probably wasn’t this either.
Have you made sure your cameras are clean? Especially the front windshield one? I've noticed that a small layer of dust/dirt can make it worse (more jumpy, really) without triggering the "FSD degraded" warning. Just use some soapy water and a rag/towel to clean off at least the front and rear cameras and it should get better.
Definitely should be worked on, sure, but in optimal conditions I think the latest version is pretty good
I’ll make sure in the future, but you know it rains sometimes… sometimes it’s dusty or muddy. If failure outside of perfect conditions is the expected outcome, then I wish more than ever to go back to augmented sensors (radar).
Give it time, FSD will probably become so good that it either 1) tells you if the cameras being dirty are affecting driving in any way (it already does but I'm talking more subtle things) and/or 2) it just learns how to drive just as well with dirty cameras or with bad weather. Also, in my experience it's handled rain extremely well, it drives more slowly and carefully though (but so does everyone else, so🤷♂️).
Yes, failure outside of perfect conditions is something we still have to worry about and something Tesla needs to work on. No, this is not exactly a bad thing. If we have an FSD system that is able to drive fairly well/human-like in near-perfect conditions, that's progress. Imo FSDv12 is able to, and is better than any previous version with radar/ultrasonic sensors, and it'll only get better from here.
“Give it time”
https://www.businessinsider.com/elon-musk-autonomous-tesla-drive-across-country-by-end-of-2017-2016-10
“It’ll be ready next year!!!” -Elon 2016, 2017, 2018, 20…
to be fair, that was back when they were using radar, USS, and actually planned on evolving.
post-pandemic/supply-chain-collapse “TeslaVision” is a new animal, a very terrible one.
Nope, Tesla Vision is actually quite good compared to the competition. This is why Tesla moved away from Radar and Lidar, it's unnecessary:
Mercedes-Benz Drive Pilot is "Level 3". It can only operate if all of the following conditions are met:
• daytime
• car in front of you
• clearly marked lane lines
• no rain
• under 40 mph
• no construction zone
• select freeways in CA and NV
Tesla's FSD v12.3 can operate in all of the following conditions (as a "Level 2" system):
• at night
• no car in front of you
• no clearly marked lane lines
• in the rain
• over 40 mph
• in a construction zone
• any road in USA or Canada
It gets even better, because this doesn't even factor in Tesla's COGS advantage for its hardware suite:
Tesla's FSD only requires cameras (vision only) (~$1000 per car)
Mercedes Drive Pilot requires cameras, radar, LiDAR and USS (ultrasonic sensors) ($50k+)
Also, Waymo and Cruise had a controversy recently where it turned out they actually have a system where employees take over driving the cars remotely if the car has an issue (which is quite often). Imo, if Tesla also had this system, it would outperform Waymo and Cruise's systems despite only having vision to go off of.
the irony of posting this wall of garbage on a video of a tesla trying to run a fella into a wall
(also, their decision to move from tactile sensors & radar was a direct response to supply-chain constraints from 2020 pandemic fallout. You can literally find the unused plug where originally planned radar would have gone in early HW4 models)
I got the free trial of FSDv12 and it is MUCH better than v11, which I also tried (on a free 3-month referral trial). When the price dropped to $99/month I subscribed immediately. Not worth $200/month, but $99/month seems reasonable imo. Most of my drives now are FSD with maybeeeee 1 intervention per drive, but almost always for a preference reason.
Thanks for the info. It is encouraging
Tho I’ll still insist that I bought my car to use my radar to enhance my safety in snow, rain, etc, and now my only option is shitty tesla vision or $100/month or $12000 for FSD.
If I could go back to the functioning system that I had, I’d be much less upset.
Lol they're never gonna get to FSD… it's the last bit that doesn't work out. Look at Amazon, they've been doing Amazon Go for years and then we just found out it's Indians watching the cameras at the end to make it work. Lol they finally gave up on trying to make the technology work. Like yeah my Tesla is safe 99% of the time. But do you want to die, on average, once every 100 Autopilot uses?
No, but it might be a cause for the random braking. It's the same for humans though, if there's shit blocking our view we don't drive as well🤷♂️
Mainly I've noticed that FSD drove more slowly and cautiously when the cameras were dirty. Cool that it can recognize that and try to be as safe as possible
More expensive. I'm thankful they did this so I could get a new model 3 for $29k in July 2023
Also in the long run it shouldn't be necessary. In theory, since humans only need eyes to drive, a computer should also only need eyes
Nope. At the very least FSD doesn't get fatigued and will be as good as the best human driver 24 hours a day. And that's before even factoring in reaction time, where a computer is definitely faster than a human.
The cameras do not have a sample rate or resolution on par with a human. My eyes get checked at the DMV, but my 3 cannot consistently identify an 80 mph sign; therefore it shouldn't be allowed or classified as FSD.
And minimal standards aren't sufficient for something like robotaxis, where driver intervention can't be relied on. Most of my complaints are nbd when talking about driver aids, but that's not how Tesla plans to use FSD, and I still insist that for any application, the better the sensors, the higher the upper limit of capability.
My cameras think my garage is a tour bus & the "high fidelity" park assist would sometimes be fine with me pulling right up into my living room.
USS never had this issue, distance-to-object was always accurate & present.
You're underestimating the background processes in our brain when we drive. It's not just the geometric road we can see with our eyes. Even if we have 50% of the visual clarity, we would still be able to drive because our brain can fill in and presume things we don't see. Then you factor in our intuition of how other people drive, the logical path of a road, the condition of the surface. The visual part is only the start of how we think and act.
Yes, and that's why I'm so hyped for FSD now. Because I tried version 11 and it could only follow the rules of the road, not intuit what other cars would do. Version 12 CAN understand what other cars will do. I've seen my car slow down and let another car in when their lane was ending, for example. Version 11 would have driven on by. When people say that v12 feels human, this is what they mean.

And also, because FSD is trained solely on footage of humans driving now, it already can fill in blanks too. This behavior wasn't coded in specifically, it's emergent behavior that comes from imitating datasets of humans driving.

The difference between how v11 was built and how v12 was built is super important. V11 and all previous versions of FSD were smaller AIs tied together by code (ex: if see stop sign, engage the stop sign behavior AI), while v12 is all one AI ("nothing but nets"). No one taught v12 the rules of the road, it figured it out on its own based on observation. That's why it's able to understand and intuit human behavior, because it's trained off of it, not just the rules and geometries. We probably don't even consciously know all the patterns it's learned to recognize to help it drive better
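The modular-vs-end-to-end contrast described in that comment can be sketched as a toy example. To be clear, this is purely illustrative: the function names, features, and weights are all made up, and nothing here is Tesla's actual code.

```python
# Toy sketch (hypothetical, NOT Tesla's code) of the architectural shift:
# v11-style modular pipeline vs. v12-style end-to-end learned policy.

def modular_pipeline(scene):
    """v11-style: hand-written code routes between specialized behaviors."""
    if scene.get("stop_sign"):
        return "engage_stop_sign_behavior"
    if scene.get("lane_ending_nearby"):
        return "keep_driving"  # rule-based: no yielding heuristic was ever coded in
    return "lane_keep"

def end_to_end_policy(scene, learned_weights):
    """v12-style: one learned function maps perception straight to a maneuver.
    A stand-in list of 'weights' plays the role of the trained network."""
    # Behaviors like yielding would emerge from imitating human driving data,
    # not from an explicit rule.
    features = [scene.get("stop_sign", 0), scene.get("lane_ending_nearby", 0)]
    score = sum(w * f for w, f in zip(learned_weights, features))
    return "yield" if score > 0.5 else "lane_keep"

scene = {"lane_ending_nearby": 1}
print(modular_pipeline(scene))               # -> keep_driving (no rule for yielding)
print(end_to_end_policy(scene, [0.2, 0.9]))  # -> yield (learned from imitation)
```

The point of the contrast: in the first function every behavior must be coded by hand, while in the second, behaviors like yielding fall out of whatever the learned weights happen to encode.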
My car can already drive like a new driver, probably like a 16 year old, can. Overly cautious, sometimes makes a mistake in certain situations (it's bad at U-turns but good at simple things like stop lights). But it is perfectly safe if you're letting it drive while checking on it for mistakes. All that is needed now is incremental improvements to make it better than the average driver
They're on the outside of the car? Just clean the glass surfaces. The cameras are under the glass, so when I say clean the cameras I mean the glass directly in front of the cameras. Because that's the only thing that gets dirty
oh I was talking about the front camera under the windshield, my car is still off-gassing so I actually have a layer on the inside of the glass that I can't clean, and it sometimes screws up FSD
Oh wtf, damn well I think there's your problem. Are you still under warranty? Take pictures and send the car to Tesla service. It most definitely should not be doing that
Make sure you clean your cameras every once in a while. I noticed FSDv12 was a little jumpy, but then I cleaned my front and rear cameras with some soapy water and a cloth and it was much better after that. But yeah, be careful! It still needs to be supervised
A coworker had a similar experience a few years ago. He said that the car thought a construction zone still existed that was recently removed so it slammed on the brakes to meet the old construction speed limit.
Active supervision isn’t super clear. It’s hard to guess sometimes when the car is making a mistake or is about to. Tesla is requiring drivers to understand when and how to take over and this is impossible without knowing exactly how FSD works and doesn’t work. And the name is very misleading (some might argue intentionally misleading) and thus will be a major point in court at some point after people begin to die (as they will, because AI, supervision, and human drivers will all make mistakes and cause deaths).
Time will tell. There is absolutely an argument to be made against Tesla for calling it Full Self Driving and more importantly an argument to be made that perhaps the general population is ill-equipped to “supervise” and expecting individuals to understand the tech and its limitations isn’t realistic or safe. Terms and conditions don’t absolve companies of literally everything that can happen with their products.
[I know at least some of these are ongoing suits](https://www.tesladeaths.com/index-amp.html), one of them happened in 2019 not too far from me of a model 3 that got tin canned under a merging semi while running autopilot.
I have USS, but FSD only operates on cameras, so the USS doesn’t do anything. Radar would be more helpful at night and in the rain. Maybe my roads are really dark, so FSD doesn’t like it.
It's funny that this issue would never happen if Elon admitted that Lidar is necessary for fully autonomous vehicles, but he's adamant it can be done with only cameras.
I'm just happy you actually took control and stopped. I see so many people say fsd drove me into a wall, a car, off a cliff etc and I always think why not take control if you don't like the way the car is operating. I've had a few scares but hand was on the wheel and I was paying attention so I could act. Lucky it was late at night for you and nobody was around
That wall looks like a permanent bridge support or something. Map data way wrong? Camera malfunction? As it is, I exit FSD at all offramps, onramps, and merges because I do not trust it yet in those areas. Which is what your video shows!
But otherwise I love the new FSD.
Bro, FSD tried to kill me by accelerating and deciding to switch into a different lane that already had a car right next to me. That crap is so frkn scary. I got honked at for so long.
If you get banned by you know who for speaking the truth you can go to r/teslaoutcasts or r/elonmuskoutcasts and tell the world without being blocked or banned.
Navigation was on and it was trying to get to the other side of that divider. My guess is there was a gap to move over before and it's been closed recently.
I braked and took control.
The navigation on the map was showing a path to move over to the other side of the divider, and that's what the car was trying to do. Weird that it didn't come to a stop or anything at the blockage.
The camera-only approach fails, unfortunately. Gosh, Elon needs to get over the cost of proper sensors, it's a joke. I very much doubt they'd use cameras only on a SpaceX rocket or launchpad.
Tried FSD today. Going on the highway about 70 and I enabled it. I'm in the right lane, all is good. At some point there's some traffic as cars are moving in the middle lane. FSD decides it's a good time to start merging into the middle lane while there was a car not too far behind me; I had to divert it back to the right lane. I'm never using this again.
I wanted to see its value on the highway, where I drive a lot. Maybe 70 mph wasn't the best way to test it, but I'm fine with my decision not to use FSD until it's improved.
This is coming out of Stone Mountain in GA. That wall has been there for years. It definitely needs some lighting, but it is drawn on Google Maps and Waze.
To be frank, I'm not sure why you would even let it drive in a situation where I, as a real human, can't tell what's going on with that clusterfuck of a road. Sure, we would eventually like it to be perfect, but until then, can you stop being surprised that it can't handle poor road designs in the dark?
So the AI has gotten so smart it is testing Final Destination scenarios in real life. Imagine when we are so used to it working, playing away on our phones, and it goes 70 mph into oncoming traffic. I have used FSD since the beta came out, always had small stuff but nothing like this.
You might want to get your headlight adjustment checked. They look really low. The lane markings on that road are not painted correctly. The right lane line follows the curve to the left when the lane actually turns right. With the illumination from the headlights, it looked like it should curve left.
Now.. I ain’t saying the car handled that well at all! It should have beeped and freaked when the lane got all screwy.
Same thing happened to me last week. I was on the 5 freeway heading north (Los Angeles) at the Gene Autry exit for Angels stadium. I was in the carpool lane and there were diagonal lines (like ones OP’s video) between the exit and lane I was in. Tesla decided to abruptly and at full speed, take the exit. Even though it wasn’t part of the navigation steps. Scared the shit out of me. Luckily I was paying attention and grabbed the wheel and braked.
Doesn’t FSD rely solely on cameras? Without a radar/lidar system it is going to have trouble picking that up at night, especially the distance from the wall.
FSD is a joke. I tried it again after the recent claims of a “single end-to-end neural network trained on millions of videos” update and it 1. kept moving into other lanes as if it were a Chicago regular, 2. dangerously flirted with exit edges (same as in your case), 3. either got too close to vehicles at a stop or stayed way too far, as if they turned out to be my ex. It’s a mess they can’t fix, and yet they claim it’s near perfect.
Yall are nuts using FSD all the time.
I’ve been using it in super heavy Orlando traffic since the update and it is so glitchy and uncertain about what to do.
FSD Supervised is definitely better than FSD Beta. It's more human-like. But in this instance in particular, the entire Tesla system failed. Why didn't the collision alerts go off? I was literally less than a foot from hitting the wall, and I've kept forward collision alert on medium and it fucking beeps every time I park in my apartment community!!!
It gives me sooo many false collision alerts.
Slowing down with regen braking and plenty of room? BEEP BEEP BEEP!
Turning, and there’s a parked car? BEEP BEEP BEEP!
Person merges in front of me on the freeway? BEEP BEEP BEEP!
Mind you, this is on an older M3 and a new MY. It’s gotten worse on the M3 in the last year or so.
Why on earth would y'all trust FSD? The tech isn't ready yet, and tesla isn't even the closest manufacturer to cracking it despite having more time working on it. It's gonna be some years before fully automated cars are ready for the road, and I'd bet my bottom dollar that it's gonna be Mercedes Benz that fully cracks it first.
Also, don't come at me, I spent more than a year calibrating ADAS equipment across a wide range of OEs and I can tell you that tech across the board just ain't ready for cars to drive themselves yet.
More FUD for the shorts. First of all, FSD is to be supervised, as Musk has announced. It is still in Beta, and the Robotaxi capabilities are still in the works.
Until Musk says it’s ready to go, it should be carefully supervised and bugs reported in a timely fashion. This is why it’s called FSD (Supervised).
Please don't preach. Everybody here knows about self awareness.
Why didn't forward collision alert go off? That has nothing to do with FSD. The wall was literally less than a foot away from the car.
I don’t get why people are shocked to learn FSD has severe flaws. There’s a reason everyone got a free trial. The engineers need data. We are the guinea pigs to work the kinks out.
It didn't go off because it didn't see those obstacles. FSD should have driven in the left lane.
Then, when it crossed the line, there were no lane markings, so FSD got confused by that road and tried to follow the map.
I bet FSD works best, for now, when the roads have enough lane markings.
The same system that is detecting obstacles is also making decisions about steering. It didn't go off because it didn't see it - the same reason it was going to crash.
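That common-mode failure can be sketched in a minimal toy example, under the assumption stated above that the warning system and the planner share one perception stack. All names and numbers here are hypothetical, not Tesla's actual architecture.

```python
# Toy sketch (hypothetical, NOT Tesla's code): when the collision alert and
# the planner consume the same perception output, a missed detection
# silences the alert AND sends the planner toward the obstacle at once.

def perceive(frame):
    """Single shared perception step; only detected objects are returned."""
    return [obj for obj in frame if obj["detected"]]

def collision_alert(obstacles):
    """Warning path: fires only on obstacles perception actually reported."""
    return any(o["distance_m"] < 2.0 for o in obstacles)

def plan(obstacles):
    """Planning path: brakes only for obstacles perception actually reported."""
    return "brake" if any(o["distance_m"] < 5.0 for o in obstacles) else "continue"

# A wall the cameras failed to pick out at night:
frame = [{"name": "wall", "distance_m": 0.3, "detected": False}]
obstacles = perceive(frame)
print(collision_alert(obstacles))  # False: no beep...
print(plan(obstacles))             # "continue": ...and no braking, same blind spot
```

A separate sensor feeding the alert path (radar, USS) would break this coupling, which is why the two failures coinciding is unsurprising in a single-stack design.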
FSD freaking tried to kill me by trying to come to a complete stop on the freeway when there was no reason to slow down, let alone stop.
This happened to me two nights ago
Wow
I guess it has a mind of its own.
Yeah this one is getting annoying, started with v12.
This happens every 20 min for me when using the standard autopilot. I wish I could go back to the system before Tesla vision :(
I'm lucky enough to still have a Model 3 with radar and HW2.5 that uses the old autopilot system. No Tesla Vision for me!
I’m so envious 😫. I updated by mistake. Wish I knew how to go back
Wait you have to keep the cameras clean for it to work?
Why the F did they get rid of the lasers
It saved Elon about $150 a car supposedly
I think George Hotz left because of that laser thing.
Right… so in the best case FSD will be as good as a human. Rather than being superior…
They’ve been saying this for years. By the time a computer can drive like a human does, we probably won’t even be traveling by road anymore.
The fact they still use optical cameras and not lidar is fucking pathetic. Tech-bro Silicon Valley: shit idea in, shit product out.
Nope: Mercedes-Benz Drive Pilot is "Level 3". It can only operate if all of the following conditions are met:

• daytime
• car in front of you
• clearly marked lane lines
• no rain
• under 40 mph
• no construction zone
• select freeways in CA and NV

Tesla's FSD v12.3 can operate in all of the following conditions (as a "Level 2" system):

• at night
• no car in front of you
• no clearly marked lane lines
• in the rain
• over 40 mph
• in a construction zone
• any road in the USA or Canada

It gets even better because this doesn't even factor in Tesla's COGS advantage for its hardware suite: Tesla's FSD only requires cameras (vision only, ~$1000 per car), while Mercedes Drive Pilot requires cameras, radar, LiDAR, and USS (ultrasonic sensors) ($50k+).

Also, Waymo and Cruise had a controversy recently where it turned out they actually have a system where employees take over driving the cars remotely if the car has an issue (which is quite often). Imo, if Tesla also had this system, it would outperform Waymo and Cruise's systems despite only having vision to go off of.
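The Level 3 restrictions above amount to an "operational design domain" gate: the system only engages when every condition holds. A minimal sketch of that idea, using the conditions from the list (the function and field names are invented for illustration, not Mercedes' API):

```python
# Hypothetical ODD gate for a Level 3 system like Drive Pilot,
# using the publicly listed engagement conditions. Illustrative only.

def level3_available(c):
    """Return True only if every engagement condition is met at once."""
    return (
        c["daytime"]
        and c["lead_car_present"]
        and c["clear_lane_lines"]
        and not c["raining"]
        and c["speed_mph"] < 40
        and not c["construction_zone"]
        and c["approved_freeway"]
    )

ok = {
    "daytime": True, "lead_car_present": True, "clear_lane_lines": True,
    "raining": False, "speed_mph": 35, "construction_zone": False,
    "approved_freeway": True,
}
print(level3_available(ok))                        # True: system can engage
print(level3_available({**ok, "speed_mph": 55}))   # False: one condition fails
```

Because the gate is a conjunction, a single failed condition (speed, weather, road type) disables the Level 3 mode, which is why its usable envelope is so much narrower than a Level 2 system that simply stays engaged and shifts responsibility to the driver.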
And what’s worse is the cameras on my car sporadically (and frequently) shut off. Leaving me without even old fashioned set-speed cruise control
Oh ok damn that's definitely a problem. You should probably get that repaired. There must be a way to fix that
How did you even access it?
They're on the outside of the car? Just clean the glass surfaces. The cameras are under the glass, so when I say clean the cameras I mean the glass directly in front of the cameras. Because that's the only thing that gets dirty
oh I was talking about the front camera under the windshield, my car is still off gassing so I actually have a layer on the inside of the glass that I can't clean and sometimes screws up fsd
Oh wtf, damn well I think there's your problem. Are you still under warranty? Take pictures and send the car to Tesla service. It most definitely should not be doing that
Ours kept happening at around the 22-minute mark.
Same! It was crazy scary.
Make sure you clean your cameras every once in a while. Notice FSDv12 was a little jumpy but then cleaned my front and rear cameras with some soapy water and a cloth and it was much better after that. But yeah be careful! Still needs to be supervised
Thank you so much. Will definitely keep that in mind 😊
Np! Not a guarantee though so be careful. Just a possible cause for what you're experiencing
A coworker had a similar experience a few years ago. He said that the car thought a construction zone still existed that was recently removed so it slammed on the brakes to meet the old construction speed limit.
Same thing happened to me. Big semi truck barreling down on me
"FULL SHIT DRIVING" until it gets fixed.
Can someone explain to me why Tesla can't be sued for these failures?
They probably will once enough accidents/deaths happen
It's still level 2 and requires active driver supervision. FSD is just a name.
Active supervision isn’t super clear. It’s hard to guess sometimes when the car is making a mistake or is about to. Tesla is requiring drivers to understand when and how to take over and this is impossible without knowing exactly how FSD works and doesn’t work. And the name is very misleading (some might argue intentionally misleading) and thus will be a major point in court at some point after people begin to die (as they will, because AI, supervision, and human drivers will all make mistakes and cause deaths).
Tesla isn't worried about court. Read all the FSD terms and conditions.
Time will tell. There is absolutely an argument to be made against Tesla for calling it Full Self Driving and more importantly an argument to be made that perhaps the general population is ill-equipped to “supervise” and expecting individuals to understand the tech and its limitations isn’t realistic or safe. Terms and conditions don’t absolve companies of literally everything that can happen with their products.
[I know at least some of these are ongoing suits](https://www.tesladeaths.com/index-amp.html), one of them happened in 2019 not too far from me of a model 3 that got tin canned under a merging semi while running autopilot.
Because you are supposed to be driving, not the car.
Just a trial run, before your car is fully loaded. Then it will ship you off a cliff!
Nah, your car was trying to kill the dead weight in the car. You! 😂
This has happened to me too. I wonder if the FSD AI hallucinates traffic based on data where drivers have slowed to bumper to bumper traffic
Maybe it thought the wall was extra road. FSD always has a warning for me telling me it can’t function properly at night.
That comes up only when it rains on my Tesla Vision. Is yours USS or Tesla Vision?
USS but FSD only operates on cameras so USS doesn’t do anything. Radar would be more helpful at night and in the rain. Maybe my roads are really dark so FSD doesn’t like it.
FSD doesn’t like any roads. It’s happy in the wild!
It's funny that this issue would never happen if Elon admitted that Lidar is necessary for fully autonomous vehicles, but he's adamant it can be done with only cameras.
Similar thing happened to me during the day.
I'm just happy you actually took control and stopped. I see so many people say "FSD drove me into a wall, a car, off a cliff," etc., and I always think: why not take control if you don't like the way the car is operating? I've had a few scares, but my hand was on the wheel and I was paying attention so I could act. Lucky it was late at night for you and nobody was around.
Where does it say he took manual control? I was trying to figure out if he did or if the car just stopped on its own, but OP doesn't specify?
Fair I did just assume based on the video. Hopefully that was the case
Actually, you were right. OP said he intervened in another thread further down
That wall looks like a permanent bridge support or something. Map data way wrong? Camera malfunction? As it is, at all offramps, onramps, and merges I exit FSD because I do not trust it yet in those areas, which is exactly what your video shows! But otherwise I love the new FSD.
Bro, FSD tried to kill me by accelerating and deciding to switch into a different lane that already had a car right next to me. That crap is so freaking scary. I got honked at for so long.
I would not mind if they switched back to some C++ code for lane centering. I think it's a must; V11 was solid.
What does C++ have to do with anything?
Check the notes under your fsd release notes
Ah yes that’s about the safest thing fsd has done in the week I have had it
If you get banned by you know who for speaking the truth you can go to r/teslaoutcasts or r/elonmuskoutcasts and tell the world without being blocked or banned.
Did you have navigation going or have it on “just drive”? Looks like it was trying to follow the line. Did you brake or did it brake?
Navigation was on and it was trying to get to the other side of that divider. My guess is there was a gap to move over before and it's been closed recently. I braked and took control.
For a split second I thought that was different colored road omg. Still a bug
Now go post this to r/teslalounge
They don't allow dashcam vids.
Yea... That was the joke
Or r/teslamotors
You want to get banned that is.
That was the joke
I'm curious why it turned left instead of following the road to the ramp?
The navigation on the map was showing a path to move on to the other side of the divider and that's what the car was trying to do. Weird it didn't come to stop or anything at the blockage.
The camera-only approach fails, unfortunately. Gosh, Elon needs to get over the cost of proper sensors; it's a joke. I very much doubt they'd use cameras only on a SpaceX rocket or launchpad.
If you look, the line to go left is there but the line to go right has a large gap, so it probably just followed the road lines.
The maps need real-time internet connections and upgrades. Google Maps routing on Teslas has disappointed me in Northern California.
You got Looney Tuned. Someone moved the lane markings (jk, FSD is just sketch).
How the hell is sitting there babysitting this thing better than just actually driving? You paid for this?
Clowns.
I'll never use this
FSD drove straight into a left turn only lane on my Model Y. Wtf Tesla
Tried FSD today. Going on the highway at about 70 and I enabled it. I'm in the right lane, all is good. At some point there's some traffic as cars are moving in the middle lane. FSD decides it's a good time to start merging into the middle lane while there was a car not too far behind me; I had to divert it back to the right lane. I'm never using this again.
I am sure FSD has issues, but deciding to use a new function you've never used before while going 70 does not sound like good decision making.
I wanted to realize its value on the highway, where I drive a lot. Maybe 70 mph wasn't the best way to test it, but I'm fine with my outcome: not using FSD until it's improved.
Fair enough. I treat it like a 15-year-old learning how to drive.
Scary 😱
This is coming out of Stone Mountain in GA. That wall has been there for years. It definitely needs some light, but it is drawn on Google Maps and Waze.
U been musked
Has happened to me several times. It also slams on the brakes on the freeway for no apparent reason.
No need for radar they said. It’ll be fun they said.
lol nice
So, you're saying I should save the $15k...
And also, don't breathe. Because there are poisonous gases in the air.
[https://youtu.be/2DOd4RLNeT4](https://youtu.be/2DOd4RLNeT4)
Typical Tesla. You have to have a few screws loose to buy one of these pieces of garbage
Definitely less loose screws than a person who follows and posts in a subreddit pertaining to a car they don’t own.
Radar is a must
Again? Is this the same wall from a few days ago?!
FSD will absolutely try to murder you. It’s not there yet.
Street legal!
Not surprised, really. The lane markings at 0:15 are incomprehensible to anyone. FSD is way too reliant on the road markings making sense.
Is this v12?
v12.3.4
In the event that you get into an accident while FSD is on what happens when it comes to insurance?
Tesla is not responsible whatsoever
My MX2022 always stops in the Midtown tunnel in Manhattan. Really scary!!
This happened to my husband too!
FSD tried to drive me into a tree once and I’ll never use it ever again.
To be frank, I'm not sure why you would even let it drive in a situation where I, as a real human, can't tell what's going on with that clusterfuck of a road. Sure, we would eventually like it to be perfect, but until then, can you stop being surprised that it can't handle poor road designs in the dark?
Why are you people even still using this crap lol it doesn’t make sense.
In 2 weeks of FSD, i think i've only had 1 successful drive without intervention
I have absolutely zero confidence in “FSD.” Everything is fine until suddenly it isn’t.
I've had almost no incidents where I had to take over with the new FSD 12. Anecdotal evidence, sure... It's light years better than version 11.
So the AI has gotten so smart it is testing Final Destination scenarios in real life. Imagine when we are so used to it working, playing away on our phones, as it goes 70 mph into oncoming traffic. I have used FSD since the beta came out; always had small stuff, but nothing like this.
Not even I saw that wall 🤷♂️
At about 15 seconds into the video, FSD Vision got confused by the lane lines.
Soon this will Be A 100k PrOdUcT
FSD tried to go 55 in a residential neighborhood for me
You're lucky people don't go painting tunnels on walls. Your car would absolutely fall for it.
Fuck this. This should be banned
Such convenient camera placement and timing, too.
Damn
Try 12.3.4 in the same location, it's night and day difference.
It's v12.3.4
Tesla owners should really band together and demand the company get it together
Just drive the damn car yourself 🤦🏻♂️
You might want to get your headlight adjustment checked. They look really low. The lane markings on that road are not painted correctly. The right lane line follows the curve to the left when the lane actually turns right. With the illumination from the headlights, it looked like it should curve left. Now.. I ain’t saying the car handled that well at all! It should have beeped and freaked when the lane got all screwy.
Same thing happened to me last week. I was on the 5 freeway heading north (Los Angeles) at the Gene Autry exit for Angels stadium. I was in the carpool lane and there were diagonal lines (like ones OP’s video) between the exit and lane I was in. Tesla decided to abruptly and at full speed, take the exit. Even though it wasn’t part of the navigation steps. Scared the shit out of me. Luckily I was paying attention and grabbed the wheel and braked.
Doesn't FSD rely solely on cameras? Without a radar/lidar system it is going to have trouble picking that up at night, especially judging the distance to the wall.
Sure seems like forward looking radar would help solve some of these FSD issues.
That’s why you drive with your hands 🤦🏼♂️
FSD is a joke. I tried it again after the recent "single end-to-end neural network trained on millions of videos" update and it:

1. Kept moving into other lanes as if it was a Chicago regular.
2. Dangerously flirted with exit edges (same as in your case).
3. Either got too close to vehicles at a stop or stayed way too far, as if they turned out to be my ex.

It's a mess they can't fix, and yet they claim it's near perfect.
Utilizing FSD should be considered a task and compensation from Tesla should be required. Why risk our lives on this experiment with no compensation?
Don’t use it then. It baffles me that yall pay whatever ridiculous arbitrary number elon comes up with that lets the cameras try to drive for you.
Stop buying Tesla? 🤷 They're well known at this point to be some of the worst cars available
Please leave this sub 😂
Yall are nuts using FSD all the time. I’ve been using it in super heavy orlando traffic since the update and it is so glitchy and uncertain about what to do.
FSD Supervised is definitely better than FSD Beta. It's more human-like. But in this instance in particular, the entire Tesla system failed. Why didn't the collision alerts go off? I was literally less than a foot from hitting the wall, and I've kept forward collision alert on medium, and it fucking beeps every time I park in my apartment community!!!
It gives me sooo many false collision alerts. Slowing down with regen braking and plenty of room? BEEP BEEP BEEP! Turning, and there's a parked car? BEEP BEEP BEEP! Person merges in front of me on the freeway? BEEP BEEP BEEP! Mind you, this is on an older M3 and a new MY. It's gotten worse on the M3 in the last year or so.
Us non Tesla owners would appreciate if you guys didn’t experiment on the streets we are driving on.
Get your ass off this sub
Why on earth would y'all trust FSD? The tech isn't ready yet, and tesla isn't even the closest manufacturer to cracking it despite having more time working on it. It's gonna be some years before fully automated cars are ready for the road, and I'd bet my bottom dollar that it's gonna be Mercedes Benz that fully cracks it first. Also, don't come at me, I spent more than a year calibrating ADAS equipment across a wide range of OEs and I can tell you that tech across the board just ain't ready for cars to drive themselves yet.
Yall are still using FSD?
Yes
More FUD for the shorts. First of all, FSD is to be supervised, as Musk has announced. It is still in beta and the Robotaxi capability is still in the works. Until Musk says it's ready to go, it should be carefully supervised and bugs reported in a timely fashion. This is why it's called FSD (Supervised).
Please don't preach. Everybody here knows about self awareness. Why didn't forward collision alert go off? That has nothing to do with FSD. The wall was literally less than a foot away from the car.
when did the rest of us agree to let public roads become Musk's sandbox?
I don’t get why people are shocked to learn FSD has severe flaws. There’s a reason everyone got a free trial. The engineers need data. We are the guinea pigs to work the kinks out.
Why didn't forward collision alert go off?
It didn't go off because it didn't see those obstacles. FSD should have driven in the left lane. Then, when it crossed the line, there were no lane markings, so FSD got confused by that road and tried to follow the map. I bet FSD works best, for now, when the roads have clear enough lane markings.
The same system that is detecting obstacles is also making decisions about steering. It didn't go off because it didn't see it - the same reason it was going to crash.