Discuss: Tesla Autonomously Delivers Its First Vehicle to Customer — And It’s More Impressive Than Expected [VIDEO]

Not a Tesla App

Administrator
Staff member
Aug 18, 2022

PrescottAZRichard

Well-known member
Oct 28, 2022
That is pretty cool. I'm thinking it was about 15 miles? Maybe a bit more; it looks like most of the drive was around 30 MPH. Thanks for explaining why the car was legally allowed to do this outside the geofenced Austin area; we were wondering which rules apply to such a drive. It is kind of strange that the rules are based on the people INSIDE the car rather than on the car sharing roadways with people inside other cars.
Now for my self-centered reaction: great stuff, gimme a SFSD update please :) . Just got the 20.6 update for the Y, but that's a bug fix / security thing.
 

Procal

Member
Jun 12, 2024
PrescottAZRichard said:
That is pretty cool. I'm thinking it was about 15 miles? Maybe a bit more; it looks like most of the drive was around 30 MPH. Thanks for explaining why the car was legally allowed to do this outside the geofenced Austin area; we were wondering which rules apply to such a drive. It is kind of strange that the rules are based on the people INSIDE the car rather than on the car sharing roadways with people inside other cars.
Now for my self-centered reaction: great stuff, gimme a SFSD update please :) . Just got the 20.6 update for the Y, but that's a bug fix / security thing.
I figured it had to be some sort of loophole, because I drove for Uber and Lyft for a short while a few years back, and one of the requirements was to have insurance that covers the occupants of the car in the event of a collision. It's kind of cool that this is possible, and it makes me wonder whether we will see that cross-country unsupervised FSD drive sometime soon because of this loophole.
 

K.I.T.T.

Well-known member
Mar 26, 2024
Italy
It's cool, and it opens the door to hub-and-spoke delivery, where parking lots are still used but the cars deliver themselves; every delivery center I have been to requires you to get there by car anyway. Either way, a whole different car delivery experience.
What's striking in all these FSD videos, though, is that the 10x playback speed doesn't let you see what is really going on. The car can be seen being overtaken on the right, even by large transport trucks; is FSD going excessively slowly, or is it something else? The same goes for all the FSD videos from Europe announcing the feature's arrival there.
 

Procal

Member
Jun 12, 2024
K.I.T.T. said:
It's cool, and it opens the door to hub-and-spoke delivery, where parking lots are still used but the cars deliver themselves; every delivery center I have been to requires you to get there by car anyway. Either way, a whole different car delivery experience.
What's striking in all these FSD videos, though, is that the 10x playback speed doesn't let you see what is really going on. The car can be seen being overtaken on the right, even by large transport trucks; is FSD going excessively slowly, or is it something else? The same goes for all the FSD videos from Europe announcing the feature's arrival there.
I'm not sure if you've seen the full drive video, which is essentially the whole drive with fewer cuts and a few more camera angles. Here it is from the Tesla YouTube channel, in case you have not seen it. To me, it seems the car was driving fine and the surrounding traffic was just driving really fast. Texas is known for having very fast highways; in some places the speed limit is 85 MPH, and you will sometimes find people going well over that, including some trucks. Either way, I would definitely prefer the car to drive on the cautious side rather than have an accident on its way to me, especially if it is a new car I just bought. :LOL:
 

K.I.T.T.

Well-known member
Mar 26, 2024
Italy
Procal said:
I'm not sure if you've seen the full drive video, which is essentially the whole drive with fewer cuts and a few more camera angles. Here it is from the Tesla YouTube channel, in case you have not seen it. To me, it seems the car was driving fine and the surrounding traffic was just driving really fast. Texas is known for having very fast highways; in some places the speed limit is 85 MPH, and you will sometimes find people going well over that, including some trucks. Either way, I would definitely prefer the car to drive on the cautious side rather than have an accident on its way to me, especially if it is a new car I just bought. :LOL:
I saw the video; it's impressive.
However, it got overtaken on the right by the truck because it was needlessly occupying the left lane. So much for cautious driving.
We are not there yet; it looks too early for fully autonomous driving.
 

Procal

Member
Jun 12, 2024
K.I.T.T. said:
I saw the video; it's impressive.
However, it got overtaken on the right by the truck because it was needlessly occupying the left lane. So much for cautious driving.
We are not there yet; it looks too early for fully autonomous driving.
I'm of the mindset that this is a necessary step. To elaborate, I think it is necessary to put this kind of system through its paces in the real world. You can have all the data, simulations, and sensors you like, but I feel as though nothing can substitute for real-world driving. I'm glad they performed this, since it means they can now go back and fine-tune their model further. Would I feel comfortable having this in my own car now? No, definitely not, at least not until they are approved for Level 4 autonomy by the NHTSA or otherwise. However, to me this proves they are getting close and making decent progress toward the end goal. Think about it: just a year or two ago, everyone thought this day was still 5 to 10 years away. And now here we are, and although it is not perfect, it is at least workable and quickly improving. I can absolutely see this being in many more cars, including personal cars in North America, by mid-to-late 2026, beyond just Texas.
 

K.I.T.T.

Well-known member
Mar 26, 2024
Italy
Procal said:
I'm of the mindset that this is a necessary step. To elaborate, I think it is necessary to put this kind of system through its paces in the real world. You can have all the data, simulations, and sensors you like, but I feel as though nothing can substitute for real-world driving. I'm glad they performed this, since it means they can now go back and fine-tune their model further. Would I feel comfortable having this in my own car now? No, definitely not, at least not until they are approved for Level 4 autonomy by the NHTSA or otherwise. However, to me this proves they are getting close and making decent progress toward the end goal. Think about it: just a year or two ago, everyone thought this day was still 5 to 10 years away. And now here we are, and although it is not perfect, it is at least workable and quickly improving. I can absolutely see this being in many more cars, including personal cars in North America, by mid-to-late 2026, beyond just Texas.
That's the point; that's what supervised FSD was meant to do: collect data (which, by the way, they do regardless of FSD) and test it on the road with the reassurance of a driver who can take over in case of trouble.

I know it's not the same because the stack is different, but my Autopilot creates more danger with phantom braking when another car drifts out of its lane than with anything else. And I really appreciate the latest updates, which make the ride much more comfortable and have almost eliminated the nag. If they also fixed the phantom braking, I would already be happy enough.

Do we really need unsupervised, fully autonomous driving right now, so badly that we can't wait a year or two for a safer solution? That's the main question. Anyway, robotaxis are already out there, so they don't even have the first-mover advantage...
 

Procal

Member
Jun 12, 2024
K.I.T.T. said:
That's the point; that's what supervised FSD was meant to do: collect data (which, by the way, they do regardless of FSD) and test it on the road with the reassurance of a driver who can take over in case of trouble.
The issue here is that as long as there is a human able to intervene or disengage the system, the training data will always carry a "human bias": the model never gets to learn tasks more difficult than the ones drivers are comfortable letting it attempt. I'm not saying people should let the cars crash, cause accidents, or get into near-miss scenarios, but people have different levels of confidence and different preferences about what they are okay with. For example, some people may not allow the car to get too close to other cars while attempting to auto park. Likewise, some people may never allow the car to miss a turn and then have to figure out a different route. And even more importantly, some people may simply never hit that "submit video" button when the car does something it is not supposed to or performs poorly. Again, there are a lot of ways in which having a fallback like a supervising driver keeps the system from improving on edge cases, or in the areas where it needs the most work.
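
To make that selection-bias point concrete, here is a tiny toy sketch in Python (purely illustrative, with made-up numbers; it has nothing to do with Tesla's actual data pipeline). If harder situations are more likely to trigger an intervention, and only intervention-free drives survive as "clean" training data, the surviving data skews toward the easy cases:

```python
import random

random.seed(0)

# Toy model: each drive segment gets a "difficulty" in [0, 1], and the
# supervising driver is assumed to intervene more often on harder segments.
segments = [random.random() for _ in range(100_000)]

def driver_intervenes(difficulty: float) -> bool:
    # Hypothetical assumption: intervention probability rises with difficulty.
    return random.random() < difficulty ** 2

# Only intervention-free segments survive as "clean" training data.
kept = [d for d in segments if not driver_intervenes(d)]

print(f"mean difficulty, all segments:  {sum(segments) / len(segments):.2f}")
print(f"mean difficulty, kept segments: {sum(kept) / len(kept):.2f}")
# The kept set is noticeably easier on average: that is the human bias.
```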

Also keep in mind that Tesla mentioned they are basically having to start from scratch again because of new, larger parameter counts and longer context windows. This means they likely need even more data now from completely unsupervised vehicles, where there is even less intervention from an actual driver. That is likely one of the reasons they are using safety monitors for this initial rollout.

K.I.T.T. said:
Do we really need unsupervised, fully autonomous driving right now, so badly that we can't wait a year or two for a safer solution? That's the main question. Anyway, robotaxis are already out there, so they don't even have the first-mover advantage...
No, of course we don't, and that is why it isn't on personal vehicles yet. Tesla is often criticized for its "move fast and break things" mentality when it comes to testing and shipping features, but you can't really say they don't take safety into account. I doubt we will see unsupervised FSD outside Texas for a while, and if we do, it will likely be limited to the Robotaxi for quite some time. I also understand the point about first-mover advantage, but they are a business after all, and better late than never. It is still a relatively small market today, but with a huge potential customer base and a massive profit opportunity. Just think about it like this: in North America, Waymo only has ~2,000 cars across a few major cities (I think it's something like 10-15 cities total). And each Waymo costs ~$200,000 with all its sensors, compute, and the vehicle itself, not to mention that the vehicles themselves are third-party. So if Tesla does manage to get into this market and can legitimately outcompete other brands on price, operating cost, and scalability, then the profits could indeed be huge.

Edit: I made a mistake; apparently Waymo only has between 600 and 1,000 vehicles across just a few cities (7 total), mainly Phoenix, San Francisco, LA, and now Austin. And their estimated cost per vehicle is between $150,000 and $250,000 - keep in mind this is after nearly 10 years of operation.
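
For a rough sense of scale, here is a back-of-the-envelope calculation with the estimates above (the $45,000 camera-only per-vehicle figure is just my illustrative assumption, not an official number):

```python
# Back-of-the-envelope fleet capital cost from the rough estimates above.
waymo_fleet_size = (600, 1_000)        # estimated vehicles in service
waymo_unit_cost = (150_000, 250_000)   # estimated cost per vehicle, USD

low = waymo_fleet_size[0] * waymo_unit_cost[0]
high = waymo_fleet_size[1] * waymo_unit_cost[1]
print(f"Waymo fleet capital cost: ${low:,} to ${high:,}")
# -> $90,000,000 to $250,000,000

# Illustrative assumption: a camera-only vehicle at $45,000 per unit.
camera_only_unit_cost = 45_000
print(f"The same high-end budget buys ~{high // camera_only_unit_cost:,} such vehicles")
# -> ~5,555 vehicles, which is where the scalability argument comes from
```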
 

K.I.T.T.

Well-known member
Mar 26, 2024
Italy
All nice. Meanwhile, today I experienced what a couple of friends went through when they got rear-ended, "allegedly" because of their Tesla's phantom braking. I can't post the video here right now, but I have it.

I stopped at a red light, accelerated to pass some cars, and - while I still had the accelerator pressed, which is crazy - the car overrode me and braked hard to a full stop. It took what felt like an eternity before I could move the car again. I was lucky nobody was behind me; my wife got hurt by the seatbelt pull.

Now, I have looked at the videos from all angles, and there is no f***ing reason why the car should have even signaled a danger, let alone taken this dangerous and potentially deadly initiative.

I don't know anymore... I liked the idea of fast car updates, but now - thinking with an aviation mindset - I'd rather have a predictable car with predictable flaws than be a (paying) guinea pig.
 

Procal

Member
Jun 12, 2024
K.I.T.T. said:
All nice. Meanwhile, today I experienced what a couple of friends went through when they got rear-ended, "allegedly" because of their Tesla's phantom braking. I can't post the video here right now, but I have it.

I stopped at a red light, accelerated to pass some cars, and - while I still had the accelerator pressed, which is crazy - the car overrode me and braked hard to a full stop. It took what felt like an eternity before I could move the car again. I was lucky nobody was behind me; my wife got hurt by the seatbelt pull.

Now, I have looked at the videos from all angles, and there is no f***ing reason why the car should have even signaled a danger, let alone taken this dangerous and potentially deadly initiative.

I don't know anymore... I liked the idea of fast car updates, but now - thinking with an aviation mindset - I'd rather have a predictable car with predictable flaws than be a (paying) guinea pig.
I'm sorry that happened to you and your significant other. I'm not sure what to say about your recurring issues with phantom braking. I can't say I've had the same experience since purchasing my Model 3, but then again, I've been using FSD since I purchased it and have never really used Autopilot. I also live in the US, so there is the possibility that we are simply getting a better version of both Autopilot and FSD, where these issues have long been resolved or are much less frequent. Maybe it is an issue with your vehicle specifically? Possibly some sort of camera defect? Or, more likely, it is because you reside outside the US, where these features are far behind the North American market? Hard to say, honestly.
 

K.I.T.T.

Well-known member
Mar 26, 2024
Italy
Procal said:
I'm sorry that happened to you and your significant other. I'm not sure what to say about your recurring issues with phantom braking. I can't say I've had the same experience since purchasing my Model 3, but then again, I've been using FSD since I purchased it and have never really used Autopilot. I also live in the US, so there is the possibility that we are simply getting a better version of both Autopilot and FSD, where these issues have long been resolved or are much less frequent. Maybe it is an issue with your vehicle specifically? Possibly some sort of camera defect? Or, more likely, it is because you reside outside the US, where these features are far behind the North American market? Hard to say, honestly.
From what I understand, phantom braking is most commonly associated with Autopilot and was more frequent in the past.

Nowadays the "overzealous braking" happens at speed, when another vehicle drifts out of its lane, and can be escaped by simply pressing the accelerator; it just gives you back control. It's very annoying but manageable.

This is the first time it has happened like this, and it shouldn't be AP at fault, as it wasn't engaged, but rather Automatic Emergency Braking, which can't be permanently disabled. The scariest part was that the car was unresponsive; pressing the accelerator did nothing. Thankfully we were at low speed and with few cars around.

Now, I could even understand the car trying to help the driver in an emergency and needing to be overridden due to an overzealous action... but not the opposite: creating an emergency situation and preventing the driver from steering back to safety.