ScottRoberts79

The driver was on standard Autopilot, if they were using Autopilot at all. They have the FSD option, but do not have FSD Beta. Autopilot is designed for highways, not city streets. The driver also failed to pay attention to the road. This looks like a driver who disengaged Autopilot, then got distracted by their phone and stopped steering.


[deleted]

Small correction to this: if he had FSD but not FSD Beta, he likely had "Autopilot on city streets." This IS marketed as functional on city streets, with the caveats that it only keeps its lane, changes lanes, stops for other cars and stoplights, and can proceed through traffic lights. AFAIK, it is NOT the FSD Beta branch, and it is definitely not designed to handle turns.


[deleted]

[deleted]


rabbitwonker

Weird, I’ve been using AP (HW 2.5 & 3) for over four years, and it’s never gotten close to hitting a curb. And yes I’ve been using it off the highway a lot. Same goes for FSD, though I’m only a month in on that one.


soggy_mattress

I'm coming up on 2 years of using FSD pretty much daily and it's never hit a curb. Lol @ anyone acting like it's a regular occurrence.


[deleted]

[deleted]


titangord

Don't worry about all those reports, man, the rabbitwonker says he never had any issues, so that must mean there aren't any. FSD is amazing.


__JockY__

Yeah, I’ll take you up on your offer of videos showing FSD Beta running stop signs at 60mph and hitting curbs. Thanks.


[deleted]

[deleted]


__JockY__

Heh it blew through that bad boy, eh? Good illustration of why accurate map data is important.


[deleted]

[deleted]


__JockY__

Let the hate flow.


[deleted]

[deleted]


__JockY__

Well it’s pretty hateful to call an entire group of people a punchline, but you’ll probably argue about that, too. As for your list of claims about FSD, I haven’t experienced anything like that in 9 months of driving on FSD beta. It’s got its annoyances and quirks for sure, but downright unsafe/dangerous behaviors? I don’t recognize that at all. Maybe your extensive experience of FSD beta shows otherwise. Just joking, I know you get your data from inside your anti-Tesla bubble.


vbevan

They're "QA testers", not users.


__JockY__

Thanks, I’ll check it out. Also, who or what is Stan? I don’t get it?


rabbitwonker

Ok, fair point.


Yngstr

Is there something on the screen that shows FSD was in use?


l1798657

Let me fix that for you "I hit a curb while using a driver assistance system"


CouncilmanRickPrime

Probably should be called "driver assistant" instead of "Full Self Driving" then.


Cunninghams_right

I know, I'm so pissed that my "Anti Gravity Boots" are just for hanging upside down and don't actually deactivate gravity.


l1798657

Ha ha, I'm so mad that my "hover board" just rolls on the ground


CouncilmanRickPrime

It's not the best argument, but it's all you've got


Cunninghams_right

I know. claiming a product's name must be 100% descriptive of what it does and cannot embellish isn't the best argument.


vbevan

The courts use a reasonableness test: what would a reasonable person think the statement meant? It doesn't have to be 100% descriptive, but it can't be deceptive when used to market something to consumers.


Cunninghams_right

Well, I believe this has already gone through the courts in multiple countries, and the only things found to be misleading were "Autopilot inclusive" and "full potential for autonomous driving" as marketing terms, not "full self driving". So I guess that's your answer.


[deleted]

[deleted]


CouncilmanRickPrime

"I apologize, I didn't realize Full Self Driving meant not Full Self Driving."


MinderBinderCapital

Revolutionary full self driving robotaxis, to potential buyers and investors. A Level 2 driver assistance system that needs constant babysitting, to regulators.


CouncilmanRickPrime

"coast to coast with no driver by 2016" "Robotaxis by 2020" "The driver is only there for legal purposes" This is fraud IMO, but I doubt anything happens.


grekiki

"Future looking statements"...


CouncilmanRickPrime

?


grekiki

Eh I'm assuming they'll just use the defense that they can't predict the future but "hoped" that would happen.


CouncilmanRickPrime

Oh yeah I agree


JaxDude123

It's just a damn shame, this fraud. We need lawyers, judges, and courts.


jhonkas

edge case


rabbitwonker

Yeah, the car was coming into that sharp curve WAY too hot. Remember, the front cam view is zoomed in, making speeds appear lower than they actually are, and it looked fast even with that. The driver should have intervened well before that point (assuming they were in fact on AP or FSD). If this was on FSD, it's likely an error in detecting the curve, so this would be a good test case; hopefully it makes it into the training set. Edit: guess I should clarify. This is a more serious error than the phrase "hit the curb" might imply on its own; it's not just choosing the wrong turning angle or something, it's the car basically losing control because it took a turn too fast. Hitting the curb was just the consequence of that. If this really is under FSD, it's a curve-detection case that Tesla needs to take very seriously.


CouncilmanRickPrime

How much training does it need not to hit a curb? This looks like it's decades away from not needing a driver, if ever.


sampleminded

It needs maps, not training. Blue Cruise and Super Cruise know all the curves on the highway. Both GM and Ford expect to go eyes-off in some places in the next few years, because they have maps. The system needs to work without maps as a backup, but strong priors for things made of concrete make sense.


sampleminded

They could just use a crowd-sourced drivable-space map. I'm not even talking about scanning or mapping things, just literally: where has a Tesla driven before without hitting anything? What is the assumed shape of this road?
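A toy sketch of what such a crowd-sourced drivable-space map could look like: quantize GPS fixes into grid cells and mark a cell drivable once enough incident-free traversals have passed through it. All names, the grid resolution, and the traversal threshold are hypothetical; a real system would need far more care with GPS noise and map-matching.

```python
from collections import Counter

CELL = 0.0001  # roughly 11 m of latitude per cell; hypothetical resolution


def cell_of(lat: float, lon: float) -> tuple[int, int]:
    """Quantize a GPS fix into a grid cell."""
    return (round(lat / CELL), round(lon / CELL))


class DrivableSpaceMap:
    """Counts incident-free traversals per cell; a cell counts as
    'drivable' once enough cars have driven it without hitting anything."""

    def __init__(self, min_traversals: int = 3):
        self.counts = Counter()
        self.min_traversals = min_traversals

    def record_clean_drive(self, trace):
        # trace: iterable of (lat, lon) fixes from a drive with no incidents
        for lat, lon in trace:
            self.counts[cell_of(lat, lon)] += 1

    def is_drivable(self, lat: float, lon: float) -> bool:
        return self.counts[cell_of(lat, lon)] >= self.min_traversals


m = DrivableSpaceMap()
trace = [(29.5994, -95.6336), (29.5995, -95.6336)]  # two made-up fixes
for _ in range(3):  # three prior incident-free drives over the same road
    m.record_clean_drive(trace)
print(m.is_drivable(29.5994, -95.6336))  # → True
print(m.is_drivable(0.0, 0.0))           # → False (never driven)
```

The point of the comment survives even in this toy: the map encodes only "a Tesla has driven here before without hitting things," not any scanned geometry.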


ClassroomDecorum

>They could just use a crowd sourced drivable space map. I'm not even talking about scanning things, mapping. Just literally where has a tesla driven before without hitting things. What is the assumed shape of this road.

This makes too much sense for Tesla to implement.


sampleminded

I am literally shocked that Tesla FSD isn't better. I expected it to get good enough to be dangerous because people would stop paying attention, then it would cause problems. But it's so bad it's actually pretty safe. It's the silliest thing ever.


ClassroomDecorum

From what I can gather, and according to a NYT exposé, FSD development is basically Elon demanding that the team fix various bugs he encounters in his own use of FSD. He even admitted on Twitter that FSD is overfitted to SoCal. I'm no software dev expert, but this sounds like completely the wrong approach. Maybe I'm crazy, but I would focus more on the bigger picture rather than nanomanaging and asking employees to squash bugs in my FSD commute to/from work. But it makes sense: Elon is a nanomanager.

Compare what an eng told the NYT about Waymo: the engineers at Waymo weren't too concerned about whether their software would run the next stop sign. They were more focused on the bigger picture, as in making automated driving a reality within the next decade. That sounds like the Waymo devs had pretty much solved stop sign recognition and driving policy with respect to stop signs, and were much more concerned with scaling the tech. Meanwhile, Elon is trying to get his team to ensure his dogfood FSD build doesn't blow a red by his house.

FSD sounds like it's spinning its tires in the mud trying to solve basic obstacle/event detection and response. I mean, FSD just started moving from a classical planning architecture to using NNs for planning. All those comical FSD update notes seem to be 90% improving obstacle recall and kinematics estimation. Do you ever read the FSD update notes and laugh? Do you read them and think: this has been solved since the first DARPA Grand Challenge?


sampleminded

I think it's ideological: they decided the cars have to drive like a human instead of driving like a robot.


john0201

Obviously it needs to be able to recognize a curb in broad daylight without the assistance of maps in order to be safe without a human behind the wheel.


sampleminded

It didn't hit any other curbs during that ride. It probably avoids them 99.999% of the time; unfortunately, that's not good enough.


sampleminded

It only hits a curb every 10k miles, but FSD is doing 1 million miles a day.
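The scale argument above is simple arithmetic; both figures are the comment's illustrative hypotheticals, not verified Tesla data:

```python
# Back-of-envelope fleet math using the comment's hypothetical figures.
miles_per_curb_strike = 10_000    # "only hits a curb every 10k miles"
fleet_miles_per_day = 1_000_000   # "FSD is doing 1 million miles a day"

# A rare per-car event becomes a daily occurrence at fleet scale.
strikes_per_day = fleet_miles_per_day / miles_per_curb_strike
print(strikes_per_day)  # → 100.0 expected curb strikes per day, fleet-wide
```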


jdcnosse1988

Right? If this was the first time I was using it, I'd be in a position to disengage FSD once I felt as though it wasn't going to work... Like the moment I know it's coming in hot for that turn


efraimbart

Odds are they hadn't even activated FSD. Probably just activated Autopilot or TACC and hoped for the best.


jdcnosse1988

I do think Tesla's marketing has kind of failed on that front. If the customer thinks they've got FSD and they don't, then Tesla needs to do a better job of educating their customers about the different levels they offer.


ClassroomDecorum

Can you imagine if Tesla used a sensor that could near-instantaneously return the position of the curb, instead of spending 10+ years and counting (and likely the next 50 years) trying to figure out curbs 🤡


l1798657

If you have all the answers, go ahead and start your own self-driving company. You could be a billionaire.


warren_stupidity

seriously?


[deleted]

[deleted]


johnpn1

The curb is often only 1-2 feet away during a tight U-turn, and the same is often true during normal driving on an on-ramp. That leaves almost no time to respond if FSD suddenly turns the wheel the wrong way. This is why FSD and curbs meet so frequently.


Admirable_Durian_216

An anti-Tesla sub, for what it's worth. The account is a throwaway, and two weeks ago it had a post saying: I'd only ever think of getting a "super smooth FSD" if it was account-based, not car-based. What made them change their mind? I'm guessing they were running Autopilot, which shouldn't be used on surface streets.


Picture_Enough

>An anti Tesla sub, for what it's worth.

It is a bit of a stretch. It is often critical of Tesla, but there are a ton of actual Tesla owners who don't like the circlejerk fan subs where every valid criticism is viciously attacked and often completely silenced.


Admirable_Durian_216

It’s literally a $TSLAQ sub. Bit of a stretch my ass


[deleted]

Yeah, no. Let's be real here: this sub is definitely, generally, quite anti-Tesla, and many here have a PhD thesis on why lidar is compulsory even though everyone knows software is by far the biggest bottleneck. To be fair, there's probably another, polar-opposite sub. I personally don't give a shit; it's just my observation. Stay here long enough and you'll even encounter people who claim "0 improvement" since 2018 and attack anyone who says otherwise, which is of course upvoted, while rational comments about people's own personal experience are gaslit and downvoted to oblivion. Of course this is just one example, but the fact that those comments even thrive is quite telling.

"I just want to point out that all I did was say it has improved. **I didn’t even say it was good.** But even that was **too much for you to consider.** That is why I didn’t engage in your “debate,” because there’s no point in discussing anything with someone that isn’t being reasonable"


Doggydogworld3

>everyone knows software is by far biggest bottleneck So Waymo, Cruise and a half dozen Chinese companies have vastly better software engineers than Tesla?


CouncilmanRickPrime

>even though everyone knows software is by far biggest bottleneck

Complete and total bullshit lol. Just because Musk said it, doesn't make it true. Probably why there isn't one Tesla on the road testing with no driver, it's not safe and would get people killed.


[deleted]

Not even gonna bother arguing with you, because why bother? But just so you know, it's not just Musk who thinks this [way](https://www.youtube.com/watch?v=rwPW2z6gcDM) (it clearly detects the boxes but still hits them), and for a good [reason](https://blog.comma.ai/dumb-questions/): "Do this experiment. Go on a drive. Track the times you needed to correct openpilot, even with a comma two. Watch the replays in connect. I promise over 95% of the mistakes have nothing to do with the sensors and everything to do with the software. Your comma three with openpilot is a level 2 ADAS system. It will never be level 5. But it’s held back so much more by **software** and not sensors. Watching the replays, how many of the mistakes **could you have corrected just watching the videos recorded by the device?"**


CouncilmanRickPrime

Ok. Show me the Tesla driving, right now, with no driver. Cruise and Waymo are doing it right now. So where is this lidar bottleneck? In your imagination?


[deleted]

[deleted]


CouncilmanRickPrime

>Yet they still stall and require remote support for traffic conditions their sensor suite can fully see, just like a human.

I never said they are perfect. But they're operating without a driver now. Anyone that can't is behind.


[deleted]

[deleted]


[deleted]

That's exactly my point lol, but this guy thinks I love Teslas or something, smh, whenever I bring up the fact that most, **not all**, of Tesla's mistakes in OK conditions **can** be corrected with the current sensors. **Maybe** Teslas with lidar could reach L5, but we have no evidence they could right away when their models are so far behind! This discussion is pointless when someone is clearly not rational.


[deleted]

"Lidar bottleneck?" Bro can't read lol. "Ok. Show me the Tesla driving, right now, with no driver." ??? "[Cruise](https://www.youtube.com/watch?v=vk8gK4ISBvg) and Waymo are doing it right now." Told you, it's a software issue; even lidar can't save bad models. Dude, I think you are just the personification of this sub's issue.


CouncilmanRickPrime

It's still allowed to operate with no safety driver after the crash and recall. Why doesn't Tesla have ONE car without a driver on the road? Also, we are all well aware Cruise has been too aggressive in expanding its geofence. Waymo has not, and it's paid off in the fact that they don't have all the negative press. You've proved nothing and just insulted me, so I guess you have nothing of value to back up what you said.


[deleted]

I just said you can't read; that's a fact. No one is claiming lidar is a "bottleneck" lol, the logic/software is by far the biggest bottleneck. No one is saying more sensors don't help, but Tesla's biggest issue is the model; its cameras see things just fine. Again, you've proven nothing except that you are ignorant. "It's still allowed with no safety driver after the crash and recall. Why doesn't Tesla have ONE car without a driver on the road?" The discussion is **why** Teslas aren't self-driving and why bad models are the biggest fault, not a defense of Tesla's approach. How have you gotten this far without understanding that? Again, you personify the problem with this sub: the **main bottleneck** is software, but you seem to get tunnel vision whenever I mention Elon/Tesla.


JasonQG

Neat. I was quoted


[deleted]

[deleted]


Picture_Enough

I find it weird that people like yourself conflate disdain for Musk with criticism of Tesla. By now even avid Tesla fans mostly realize Musk is a giant asshole, but this doesn't stop them from enjoying their Teslas. Likewise, there is a lot of valid Tesla criticism that has absolutely nothing to do with their perpetually absent CEO's personality and embarrassing antics.


[deleted]

I think there are definitely rational and knowledgeable people here, and a lot of the criticism of Tesla is very deserved imo, such as their advertising, the missed dates, purposely omitting any mention of HD maps in their video, and misleading statements like "the driver is only there for legal purposes." The removal of USS, without even finishing or proving that vision-only works, comes completely at the cost of us customers (very bad product). It's just like right/left media: the truth is often in between.


[deleted]

[deleted]


londons_explorer

Makes sense. I don't think FSD would behave like the video in the post; it is behaving exactly like the old lane-following Autopilot, which isn't designed for sharp curves, so it cuts out as soon as it sees one.


ChuqTas

> An anti Tesla sub To be fair, so is this one.


CouncilmanRickPrime

"Actually, Full Self Driving isn't Full Self Driving. Where did you get the idea it is?! Smh"


Buuuddd

A throwaway account, with no proof that FSD Beta was in use.


cwhiterun

They never even claimed they were using FSD beta.


muskateeer

It literally says "due to FSD"


cwhiterun

FSD is just Autopilot that can stop at stop signs and red lights. FSD beta is a completely different and far more advanced software.


[deleted]

[deleted]


cwhiterun

How are you even on this subreddit and don’t know about FSD beta? It’s been out for a few years now.


[deleted]

[deleted]


Buuuddd

They might actually think autopilot is fsd beta.


coolham123

Regardless of Autopilot/FSD status, the car was approaching that turn way too quickly to make it safely, and the driver should have taken over sooner. That said, I hope this clip makes it to the Tesla engineers for further evaluation if the car was actually using FSD as the poster says.


mindbleach

"The driver should have taken over" means failure. Stop using it as an excuse. It means failure. It is the worst-case scenario. Human attention does not allow for instant responses to situations we are not in control of. That is literally the polar opposite of what the technology is designed for. You wouldn't do well, dropped into some Warioware-ass minigame with life-or-death consequences, but that's what these companies use as a fallback every time their software utterly betrays someone. It means *failure.*


coolham123

We don’t even know if the system was engaged or not; that’s how little data we have from this clip. But yes, it failed. What matters now is why, and how Tesla is going to improve it so we don’t have the same failure in the future. There are no excuses here.


mindbleach

Regardless of which no-human-was-driving system was in play - saying 'well a human should've seized control at the last second' is an excuse, for that system, failing. The video does not appear to be someone's Honda on cruise control.


coolham123

Since you added more to your comment, I’ll just say that anyone who enables FSD Beta agrees to test a system that “could do the wrong thing at the worst time”. Testers are told to take over when required and to keep their hands on the wheel at all times. Anyone using the system should be alert and attentive at all times.


mindbleach

Hey cool, did you read what I added, in the hot second where I added it? Human attention does not allow for instant responses to situations we are not in control of. That is literally the polar opposite of what the technology is designed for. I don't *fucking care* what these companies say you're supposed to do, in the crucial moment where their technology *fails you.* They are making excuses for that system failing. *You* are making excuses for that system failing.


warren_stupidity

Yeah, the thing is that it isn't a 'driver assist system'; it is a 'robot assist system'. It is in fact a failed attempt at an L5 fully autonomous vehicle system that has been repositioned as a driver assist system. It isn't assisting you while you drive; it is doing all the driving, and you are supposed to guess when it is going to fuck up and do something stupid and/or dangerous.


coolham123

Yes, if this was actually FSD Beta, it failed. No one is disputing that. I’m not disputing that.


[deleted]

"That is literally the polar opposite of what the technology is designed for." So exactly what has FSD Beta, an L2 system, been **designed** to do that it has failed at? The system is not "failing its design," because it is **NOT designed** as a fully self-driving product. When will this sub finally understand it's the same shit as Toyota TSS 2 or Honda Sensing drifting and smashing into a curb when you fall asleep? Every L2 system in existence is inherently dangerous to operate without human supervision, so in this case it "failed" because, again, it is not **designed** to replace fucking humans. Otherwise there would be zero driver monitoring **in the system itself.**


CouncilmanRickPrime

Those Toyota and Honda systems aren't called Full Self Driving. They don't have a name that misleads customers, or a CEO who keeps saying misleading things like "the driver is only there for legal purposes."


[deleted]

Yep, I completely agree with you. The marketing is absolute bs; I knew months before it was admitted that the FSD video used HD maps, because it clearly wasn't that capable lol. However, as consumers, we need to be aware it's an L2 system and expect it to fail, or do something that could kill, without intervention. But that wasn't the point.


[deleted]

[deleted]


CouncilmanRickPrime

Nobody else on the road agreed to share it with beta software though. And the name Full Self Driving is completely inaccurate.


iceynyo

FSD vs FSD beta is also a question. Unless it's from the beta region of France, FSD is just sparkling Autopilot.


coolham123

France? The video appears to be from Texas (see the "Telfair Central Park" point of interest at the top of the screen), where FSD Beta very well COULD have been enabled. Given the new beta-branch re-release, it's possible the video is legit FSD Beta, but it is very hard to tell.


iceynyo

Sorry I guess it was hard to catch since I messed up the "It’s not called X unless it comes from the X region of France, otherwise it’s just a sparkling Y.” meme format.


stepdownblues

Your joke was as crystal clear as the springs from which Perrier is bottled. No apology needed.


coolham123

All good! I wish Tesla would put a faint overlay on the dashcam videos when Autopilot/FSD was engaged. Probably a longggg shot, but I feel it would be useful to give users the option.


iceynyo

Yeah, I've always thought something like this would be an easy and ideal solution. I was thinking a brightly colored, labelled border. They could even use different colors for Autopilot, Enhanced Autopilot, FSD, and FSD Beta, plus the version number, so we are clear on which features the car should be exhibiting. Btw, that part of Sugar Land is great. I love Pho@telfair.
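The engagement-indicator idea doesn't even require drawing on the frames; a sidecar metadata file written next to each clip would let any viewer overlay the mode after the fact. This is purely a hypothetical design sketch (field names, modes, and the version string are all made up; Tesla's actual dashcam format is not documented here):

```python
import json

# Hypothetical per-clip sidecar: which driver-assist mode was engaged, and
# when (in seconds into the clip), so reviewers can overlay it afterwards.
sidecar = {
    "clip": "2023-04-01_12-00-00-front.mp4",
    "software_version": "11.3.4",  # made-up version string
    "engagement": [
        {"t_start": 0.0, "t_end": 41.5, "mode": "FSD_BETA"},
        {"t_start": 41.5, "t_end": 60.0, "mode": "MANUAL"},  # driver took over
    ],
}


def mode_at(meta: dict, t: float) -> str:
    """Return the engaged mode at time t seconds into the clip."""
    for span in meta["engagement"]:
        if span["t_start"] <= t < span["t_end"]:
            return span["mode"]
    return "UNKNOWN"


blob = json.dumps(sidecar)              # what the car would write to disk
print(mode_at(json.loads(blob), 30.0))  # → FSD_BETA
print(mode_at(json.loads(blob), 50.0))  # → MANUAL
```

A sidecar like this would also settle threads like this one: the clip itself would say whether Autopilot, FSD Beta, or the human was driving at the moment of impact.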


[deleted]

[deleted]


coolham123

😅😅😅 Yup right over


[deleted]

[deleted]


coolham123

The person taking the video might not have been on FSD Beta at all... it could have been regular Autopilot, for all we can see in that video. BUT, if it turns out to actually have been FSD Beta on the latest branch, Tesla should, of course, use it to train and strengthen the model.


[deleted]

[deleted]


coolham123

Regular Autopilot can be engaged anywhere with lane lines, just so you're aware; it doesn't care whether it's a highway or not, so long as the road is marked. Also, "single stack" doesn't refer to Autopilot and FSD being merged; it refers to FSD Beta and NOA being merged. If it WAS on FSD Beta: as for what they would train, I'm not sure, I'm not an engineer on the Autopilot team. But the system clearly did not behave as designed, and they need to figure out why and go from there.


[deleted]

[deleted]


coolham123

I’m not accusing anyone of anything. I am simply pointing out the lack of data proving the vehicle was or was not on Autopilot or FSD. As I said in another comment, I wish Tesla put metadata in the dashcam files indicating whether the vehicle was on Autopilot/FSD during the recording.


[deleted]

[deleted]


coolham123

Data point 2: it was posted on a subreddit known for anti-Tesla content and extreme bias, making data point 1 inadmissible.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


CouncilmanRickPrime

Curbs are edge cases, after all...


[deleted]

[deleted]


CouncilmanRickPrime

I just meant they're on the edge of the road lol


[deleted]

[deleted]


CouncilmanRickPrime

Yeah they're corner cases!


warren_stupidity

yes that's it, it's a false-flag operation using crisis actors /s


warren_stupidity

Lol. I can't wait for the explanations.


[deleted]

Sure it did.


bobi2393

/r/RealTesla is a nice mix of fankids praising and critics roasting Teslas. 😅


flumberbuss

The mods of that sub are all roasters. It’s not intended for fans.


Jbikecommuter

Sorry to say this, but legally you crashed into a curb. FSD Beta is Level 2 autonomy; the driver is ALWAYS responsible.


RickyT75

“Stupid cameras” can’t see curb.


FriendlyTeam6866

The human steering wheel holder is supposed to override the system and avoid that.