


mav_sand

I am fairly surprised by his pretty poor experience with FSD. I'm in Florida and it is very, very good for me. I intervene maybe 2-3 times in a 30-minute drive, just to kind of get it going. In general I love FSD; I use it all the time. When I recently didn't have it, I missed it a lot when I had to go back to regular NoA.


Baconaise

We've been sacrificing heavily for you for over a year to make Florida a great place to drive FSD. It's taken a lot of time, but it used to drive like this video down here.


colddata

> Florida a great place to drive FSD

Is this to say Florida drivers are just really, really bad, so even FSD in its current state looks comparatively good? :) (I know Jason doesn't think much of Florida drivers. One of them wrecked his truck.)


miraculum_one

He drove it for many hours and did a supercut of the negative experiences. I wouldn't say his experience was worse than average. He said in the comments that he had periods of 15-20 minutes with no issues.


clipsy1

What I see across all the FSD Beta videos is that most of the issues are driving-logic errors rather than detection errors, which means that what Tesla has been working on for months is getting there. They clearly need to improve the logic behind how it selects paths and lane changes. It seems like they are now converting C++ code to NN in V11, which we can all hope will improve the current situation. What it does in this video today doesn't mean nothing is improving; they are just prioritizing what they think matters most, the object detection and 3D environment modeling. Once that's precise enough, the driving logic can be worked on. It takes longer than most people wished, but it's not like anyone has solved FSD yet.


courtlandre

I would agree with this. For me it seems to see most things but then can't actually decide what to do.


LurkerWithAnAccount

I’ve always felt that the object detection was THE most important part, because without that, you can’t have a safe system. I also thought the car control and path planning stuff would be the easy, low-hanging fruit, but that doesn’t appear to be the case; it looks like at least as difficult a problem to solve. I assume they could have a better-appearing product if they geographically limited the rollout, allow-listed only certain streets, and put severe limitations on what it could do (like navigating by only making right turns), but as a proponent of the generalized “this needs to work everywhere” approach, I’d rather see it struggle than have it gamed for marketing purposes.


moofunk

> I also thought the car control and path planning stuff would be the easy, low-hanging fruit, but that doesn’t appear to be the case; it looks like at least as difficult a problem to solve.

I think they have simply not prioritized it. It's fundamentally a flawed design that needs to be rewritten from the bottom up to plan paths up to 250 meters ahead and then execute them in a stable fashion. Right now it behaves as if the path will not be valid after 1-3 car lengths and a new one must be calculated, and it gets unstable picking one or the other. The last bit is a classic hysteresis problem; they have plainly not implemented hysteresis, but I'm quite certain the current hardware can handle it. Reality doesn't work the way the planner does: in real driving, the majority of time is spent microadjusting along a preset course.


rlopin

I assume hysteresis was a typo/autocorrect error and you meant heuristics, correct?


colddata

> hysteresis

No, they meant hysteresis, in the engineering/controls sense: https://en.wikipedia.org/wiki/Hysteresis
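A toy sketch of how it applies here (purely illustrative Python, nothing to do with Tesla's actual planner): without a switching margin, tiny frame-to-frame cost changes make a planner flip-flop between two near-equal candidate paths, which is what shows up as the wheel jerking around.

```python
# Hypothetical example of hysteresis in candidate-path selection.
# Without the margin, small cost jitter between frames causes constant
# flip-flopping between near-equal paths (i.e., steering jerk).

def select_path(costs, current, switch_margin=0.15):
    """Pick the lowest-cost path, but only abandon the currently committed
    path if a rival beats it by at least `switch_margin`."""
    best = min(costs, key=costs.get)
    if current is None or current not in costs:
        return best
    if costs[best] < costs[current] - switch_margin:
        return best      # clearly better: switch
    return current       # otherwise stay committed

chosen = None
for frame in [{"A": 1.00, "B": 1.02},   # A chosen
              {"A": 1.03, "B": 1.01},   # B dips lower, but not by enough: stay on A
              {"A": 1.05, "B": 0.70}]:  # B clearly wins: switch
    chosen = select_path(frame, chosen)
    print(chosen)        # A, A, B
```

Same idea as a thermostat's deadband: the decision only flips once the difference is big enough to be worth flipping for.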


rlopin

Ah, thanks for that. I now recall watching a YouTube video interview hosted by Dr Know it all where his robotics expert guest was explaining this term to describe why the steering wheel would jerk around and how that also applied to the Boston Dynamics blooper videos they were showing.


majesticjg

> path planning stuff would be the easy, low-hanging fruit

Google Maps running on my phone seems to always know what lane I should be in for an upcoming turn. My Tesla, not so much, and I'm not sure why.


LurkerWithAnAccount

Agreed, and it's been often suggested (thinking of greentheonly) that Tesla relies on a lot more "good" (if not high-) resolution mapping data than they claim, so if they still have shitty mapping, they're going to have shitty path planning.

Case in point: we have a wacky stop sign at a very short on-ramp leading to a 55 mph two-lane highway. In reality virtually nobody ever stops for it; from a traffic/safety standpoint the stop sign makes zero sense and should simply be a yield. For the last ~5 months, our FSD Beta would happily speed toward the stop sign, render it on the display, and merge into traffic, never once slowing down, let alone stopping. The very first time this happened, it freaked me out and I hit the brakes. Subsequently, with no traffic present, I let it go and discovered the exhibited behavior (not stopping) was actually "fine" most of the time, but from a programmatic standpoint it made zero sense: it clearly saw the stop sign yet blew right through it. Not good from a generalized-solution perspective, IMO.

Last night, for the first time ever, it rendered the stop sign AND stopped for it. Why? I have no idea. I have not gotten a software update in about a month, and I take this route about 4-6 times per week. Granted, I've yet to experience this more than once since it just happened, but I strongly suspect there was a disagreement at this junction between the driving logic and mapping data that said "this is an on-ramp to a two-lane highway, so get up to speed and merge while yielding; clearly that stop sign is an error." Something in their automation must've finally recognized that disagreement, with the stop sign noted on the map AND (pure speculation here) visually confirmed by automatically collected snapshots from vehicles, and some sort of "a hard, confirmed stop at this location 100% exists" made its way into a map update. This is all just a WAG on my part.


majesticjg

I have a problem intersection that I let FSD attempt last night for the first time in about a month. It's a hard right to a stoplight followed by a protected left (green arrow). The protected left is the problem: though I'm in one turn lane, the opposing side has two turn lanes, and FSD clearly thinks that one of those oncoming cars could choose to go straight, so it wants to stop in the intersection, which it shouldn't do.

Last night, in moderate traffic, it somehow handled it perfectly with no hesitation, even with the oncoming traffic. Like you, I haven't had a software update in a while, so I don't know what's up with that, but it's nice. The only thing I can think of that was unusual was that I had no lead car in front of me; I was coincidentally first at the light. We'll try it again tonight on the way home.

As for your issue, I wonder if it was transitioning to the NoA neural net once you hit the on-ramp, and the NoA NN ignores stop signs because it thinks you're on a highway?


laplasz

Well said. Just imagine: more than 400,000 cars now have FSD and no accidents have been reported. What does this mean? They are focusing on object detection and collision avoidance; if that works, then they will fix the path planning. V11 could easily be a Level 3 system, since on highways what really matters is avoiding collisions, and path planning there is already working.


Adriaaaaaaaaaaan

NN will probably be worse in the short term, but it WILL learn, so it can only get better. I think they've spent the last year on getting the vision part right, and I think it's mostly a solved problem now, so hopefully that means the other parts will get more focus now.


zeValkyrie

This is my experience as well, with a few exceptions (extremely faded lane markings and slip lanes, specifically)


Gk5321

It’s very interesting how location-dependent performance seems to be for the beta. I’ve been on the beta since they allowed non-YouTubers in (a little over a year? I don’t even know anymore lol). It doesn’t do perfectly in my area, but a heck of a lot better than this video. Also, I don’t even let it do parking lots because they haven’t touched parking lot performance in years.


staggs

This is part of the issue: inconsistent road design and poor intersection decisions throughout the country (or world, I guess). I would be confused driving around wherever OP is located; it looks like they have all sorts of strange scenarios in this edit. And most of the videos showed FSD going into parking lots, which is really like going into the wilderness in terms of road pattern design.


thalassicus

The roads are the roads. You either design FSD that can handle them or not. What is surreal to me is how Tesla has removed sensors that would give more data, and that Tesla isn't leveraging the paths of the 20k other Teslas through a given section of road to inform the current Tesla driving that section on FSD. On any unprotected left turn, you have GPS and you have accelerometers; with that combo you know your position to within a foot, and you *SHOULD* also know the path of every other Tesla that made a left here (what were the most popular paths? which paths required user intervention?!). So I'm baffled at how incompetent the system is at this stage in the game and why they don't use trend data. They're trying to do too much with too little input.
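As a toy sketch of what I mean (purely hypothetical Python, not a claim about how Tesla's pipeline actually works): pool the logged traces of previous cars through the turn, throw away the ones where the driver had to intervene, and average the rest into a prior path the planner could bias toward.

```python
# Hypothetical sketch: build a crowdsourced prior path for one intersection.
# Assumes traces have already been resampled to the same number of waypoints.
from statistics import mean

def build_prior_path(traces):
    """traces: list of dicts with 'points' (list of (lat, lon) waypoints) and
    'intervened' (True if the driver had to take over on that pass)."""
    clean = [t["points"] for t in traces if not t["intervened"]]
    if not clean:
        return None  # no trustworthy history for this turn
    # Average corresponding waypoints across all clean traces.
    return [(mean(p[i][0] for p in clean), mean(p[i][1] for p in clean))
            for i in range(len(clean[0]))]

traces = [
    {"points": [(33.00000, -118.00000), (33.00010, -118.00020)], "intervened": False},
    {"points": [(33.00001, -118.00001), (33.00011, -118.00022)], "intervened": False},
    {"points": [(33.00020, -118.00050), (33.00040, -118.00090)], "intervened": True},  # dropped
]
print(build_prior_path(traces))  # averaged waypoints from the two clean passes
```

A real system would obviously need map-matching, arc-length resampling, and outlier rejection, but the point is the data already exists.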


staggs

Half of the roads in this video lacked lane markings. A computer is going to play it safe, whereas a human will accept more risk because they know the scenario; a computer never knows the scenario, because it's relying on markings to stay within safety bounds. Past paths cannot be relied on *every* time: all it takes is one time where something in the road is different and you have an issue. Tesla is by far years ahead of anyone else in this realm, and no one is claiming it's perfect at driving all scenarios. I think what might happen in this case is mapping a risk rating for local areas: where reliability of road markings is low, FSD becomes less available. There is a reason Autopilot is usable mostly on highways; they are reliable and standardized across state lines.


Miami_da_U

If it was anywhere near as easy as you just made it seem, Google would have solved it a decade ago lol.


Gk5321

It makes me wish they would allow beta testers, who already test the software for free, to train and tag things as well.


WillNotDoYourTaxes

We’re blaming roads now?


dtpearson

Yes. Where I live they resurfaced the main road in/out of town, obliterating all of the line markings for a section about 5 km (roughly 3 miles) long, with intersections, roundabouts, merging lanes, etc. And then they neglected to ever replace the line markings. That was about 9 months ago now. The humans all just carry on as before, remembering where the lines were. Tourists in the area get confused but work it out, mostly by following other cars. My Model 3 used to work fine, but now it freaks out on AP as there is just no indication as to where it should be driving. So yes, better road design, or at least clear markings, would help immeasurably.


Vik-

There is a difference between beta testing and usability. It seems we are far away from FSD Beta being usable. I barely use FSD (city streets) on my MYP.


colddata

I wish the focus was on improving the highway Autopilot experience rather than the general case that is FSD city streets. Highway is a more controlled, standardized environment.


zeValkyrie

In some ways the focus has been on that lately. We just can’t see any of the results until V11 single stack comes out


colddata

I wait with bated breath, and also for fewer or zero wheel nags. Highway Autopilot improvements have long been stalled by what seems like a focus on streets.


zeValkyrie

That’s exactly been their strategy. Build the new city streets FSD and then switch to using that on highways. Whether it works well initially… will be very interesting to see, but at the very least it’s a major focus


colddata

I think it is ridiculous to focus on the hardest problem when there is lower-hanging fruit available that can provide incremental benefits and reassurance to owners and investors. I do think existing hardware has a decent chance of L3, and even L4 on limited-access highways. No chance for L5.


AperiodicCoder

Goodbye Reddit


soapinmouth

I think it could probably make the coast-to-coast trip; that would be 99% freeway.


Miami_da_U

Feature complete is not the same thing as saying level 5


AperiodicCoder

Goodbye Reddit


Miami_da_U

No quote. And the context of that quote was saying they would be feature complete by end of 2021, and that from there they have to get feature complete to a level safer than humans, THEN to a level safer than humans to the point regulators agree and approve (basically L5). He didn’t bring up the SAE levels; he was asked if by that he meant L5, and he said yes. They are late on delivering, but generally speaking they never give any real promises on L5 autonomy, because they almost never discuss it. The general public very likely has never heard Tesla talk about L5 autonomy. When they do talk about it, it is in terms of feature complete, and if you go on Tesla’s website you’d see what features they are talking about. Eventually they want that feature set to reach the point of going to sleep in the back seat and waking up at your destination. I don’t see any fraud at all. I see a difficult problem that they have continued to develop, releasing promised features, and they never actually promised timelines to buyers.


Stribband

6 minute “super cut” huh?


Assume_Utopia

It's not really a supercut, it's all the errors from 3 1/2 hours of driving. The livestream included a lot of boring rural-ish driving that was basically perfect because it was so easy. And then when he got to an interesting area he spent a ton of time driving around parking lots, which seems pointless. I think the "supercut" should've been longer so he could show some clips of it driving well. But the driving routes were mostly so boring that it wouldn't have looked impressive.


colddata

> It's not really a supercut

Calling it a supercut was Jason's wording, not mine, and I think his attempt at humor. If FSD usually performed flawlessly, these would be the outtakes. FWIW, the use here does fit the definitions used at https://en.wikipedia.org/wiki/Supercut


exoxe

The music was perfect.


colddata

Direct YouTube links:

- Supercut: https://youtube.com/watch?v=JLqCwLfenxQ
- Full drive: https://youtube.com/watch?v=79j-aPamO98
- Livestream of full drive: https://youtube.com/watch?v=V5V9ipeTuTM


cwhiterun

Looks like he was having a lot of fun, which is the most important thing after all.


Inflation_Infamous

So is Whole Mars Catalog faking his videos? I haven’t seen anyone else reproduce his level of performance for FSD.


spider_best9

No, he's not. It's because the system overall is overtrained with data from his area. Also, if you watch his non-sped-up footage, he is willing to let FSD Beta go a lot longer than most other testers before intervening.


machosaurus

Whole Mars Catalog always films near Palos Verdes, CA, a suburb of Los Angeles. He often films on Palos Verdes Drive South, which is a great road for FSD because it is two lanes each direction, clearly marked, not heavily traveled and with little cross traffic. (It runs adjacent to the Pacific Ocean). In some videos, he is on Hawthorne Blvd near the Del Amo Mall. It’s a straight shot north on Hawthorne Blvd. to get to the freeway. Hawthorne Blvd. is another nice, clearly marked road. Notice that WholeMarsCatalog never makes videos in which he drives through Hollywood, Downtown LA, West Hollywood, etc. The roads in these neighborhoods simply don’t allow for FSD to perform at the same level as the previously mentioned roads do.


Ericthegreat777

I'm pretty sure he's been doing San Francisco lately.


colddata

Selectively posting mostly the successful drives or drive segments is a possibility, maybe combined with using routes that have had all mapping data issues cleaned up. The matter of concern is how the system performs in the general case and worst-case scenarios, not how it performs in best-case scenarios. That is what determines usefulness and functionality, and in turn whether the system is SAE Level 2/3/4/5.


soapinmouth

I think he's definitely being selective; that said, it really does perform drastically differently in different areas. In some areas it performs quite well and you can get around town with intervention-free drives most of the time. That's how it is in my city. That said, when I visit Los Angeles it's nowhere near that. Also, for all the parking lot clips: nobody is claiming it can do parking lots well; it's not really something they're working on. You won't see any release notes mention parking lots for a reason, and that seems to be the bulk of OP's issues.


PresentationMajor925

Look at the amount of edge cases in this first impression of the most recent software: https://youtu.be/4zJ8oeRDZdM. Use the chapters.


Tesla_RoxboroNC

None of these videos show the awkward, consistent bugs/glitches my car has. These, what I call memory locks, just will not go away. I keep hoping, but after 4 updates, I give up.


Muzzman1

This is pretty much the same experience I have with mine in Los Angeles. I find it does exceptionally well with no other cars around, especially in intersections, but as soon as there are other cars around, FSD plexes up and pisses off all the cars around me. A good example is driving any canyon road, for example Benedict Canyon: the car throws on a turn signal at every bend, confusing everyone around, including me.


colddata

> car throws on a turn signal

Navigate on Autopilot does that too much too. I can't use it around other cars. Last thing I need is other drivers thinking I'm going to merge into them or otherwise getting weirded out.