Tesla FSD v9: how it’s going. We take a look at what v9 is capable of

By Henry Farkas

It’s presumptuous of me to write an article on Tesla’s FSD Beta v9 considering the sad fact that my Tesla Model 3 doesn’t have it. I’ve had the urge to write one anyway, but what could I say? Well, I’ve watched videos and read articles by people who do have it, and now I have something to say that no one else has said: I’m going to compare Tesla to Mercedes-Benz.

Numerous people now have access to the FSD v9 Beta, and many of them have posted videos online describing their experience. We take a look at some of the most interesting ones below.

All these reviewers mention that v9 is much smoother and more human-like in its driving, leading to fewer disengagements. You’ll note that even though v9 is a better driver, it’s not yet a human-level driver. There’s no such thing as a disengagement when a human is driving.

AI Addict took his Tesla with v9 down Lombard St. in San Francisco and here's how it did:

For those of you not familiar with San Francisco, Lombard St is extremely curvy and hilly. It’s also very narrow. It’s a one-way street, so you have to drive downhill. The last time I drove down that hill, I scraped my car on a cement curb. AI Addict went down the street twice, and he had to take over both times. Clearly, Lombard St is an edge case that the neural network hasn’t mastered yet.

Here’s a v9 video in San Francisco that doesn’t go down Lombard St, but that does still need some human interventions.

Here’s a video by Dirty Tesla in downtown Ann Arbor. He did need to intervene a number of times.

So, basically, v9 is much better than v8, but it’s still not able to drive as well as the average human driver. So, what else is new? Here’s the kicker.

There’s an article in Engadget that describes the author’s test-track experience in a Mercedes-Benz with level 3 self-driving. Mercedes is planning to release level 3 within a year.

Below are the six levels of vehicle autonomy as defined by SAE International, the group whose standards the US government now uses.

SAE International autonomy levels
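Since the levels graphic above may not survive every format, here’s a rough paraphrase of the six SAE J3016 levels as a small Python mapping. This is my own plain-language summary, not SAE’s official wording:

```python
# Plain-language paraphrase of the six SAE J3016 driving-automation levels.
SAE_LEVELS = {
    0: "No automation: the human does all the driving",
    1: "Driver assistance: steering OR speed support, not both at once",
    2: "Partial automation: steering AND speed, but the driver must supervise",
    3: "Conditional automation: the car drives itself in limited conditions; "
       "the driver must take over when asked",
    4: "High automation: no driver takeover needed within its operating domain",
    5: "Full automation: the car can drive anywhere a human could",
}

for level, summary in SAE_LEVELS.items():
    print(f"Level {level}: {summary}")
```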

Here’s the thing about the Mercedes-Benz version of level 3. It’s geofenced to limited-access highways, and it’s speed-fenced to less than 60 kilometers per hour, which translates to less than about 37 miles per hour. So if you buy one of these cars for the level 3 self-driving feature, you’ll be able to use it only on limited-access highways during traffic jams. According to the video, you’ll be able to watch movies, play video games, and send texts while in traffic jams, that is, unless the police see you doing those things. You’ll be able to keep your hands off the steering wheel, but you still have to be ready to take over instantly if the car decides it doesn’t know what to do. If you fall asleep, or raise a newspaper high enough that the interior camera can’t tell whether you’re awake and alert, the level 3 self-driving feature will stop working.
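As a quick sanity check on that speed fence, the km/h-to-mph conversion works out like this. Purely illustrative; the constant is the standard mile-to-kilometer definition:

```python
# Convert the Mercedes level 3 speed fence from km/h to mph.
KM_PER_MILE = 1.609344  # exact by international definition

def kmh_to_mph(kmh: float) -> float:
    return kmh / KM_PER_MILE

print(f"60 km/h is about {kmh_to_mph(60):.1f} mph")  # about 37.3 mph
```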

So now, let’s get back to Tesla. Although I don’t have FSD Beta v9, I do have Navigate on Autopilot on my Model 3, and that’s what gets used on limited-access highways even on cars with FSD Beta v9. So here’s my experience on limited-access highways.

I do have to keep my hands on the steering wheel, and I’m not permitted to text, play video games, or watch movies even when I’m in a slow-moving traffic jam. Frankly, I wouldn’t feel safe doing those things while driving any car, including a Mercedes. And keeping my hands on the wheel seems like the right thing to do in a car that might want me to take over at any moment. It would take one or two tenths of a second to get my hands onto the steering wheel if they were off the wheel when an emergency arose. It would take one or two seconds to figure out what to do if my mind were on a text, a video game, or a movie when a disengagement happened. Even a tenth of a second could make the difference between a close call and an actual crash.
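To put those reaction times in perspective, here’s a back-of-the-envelope calculation, my own illustration rather than anything from the cited coverage, of how far a car travels while the driver is still reacting:

```python
# Distance a car covers, at constant speed, while the driver is still reacting.
def reaction_distance_m(speed_kmh: float, delay_s: float) -> float:
    # km/h -> m/s, then distance = speed * time
    return (speed_kmh / 3.6) * delay_s

# Hands already on the wheel: roughly 0.2 s to respond.
print(f"{reaction_distance_m(100, 0.2):.1f} m")  # 5.6 m at 100 km/h
# Mind on a movie: roughly 2 s to reorient and respond.
print(f"{reaction_distance_m(100, 2.0):.1f} m")  # 55.6 m at 100 km/h
```

At highway speed, the difference between the two scenarios is roughly 50 meters of travel before any evasive action even begins.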

For the most part, the only time I ever need to intervene while on a limited-access road is when my Tesla and I disagree on which is the most propitious travel lane. This sort of disagreement happens often enough that I have the settings adjusted so that the car needs my consent before it changes lanes.

But if you’re willing to let your Tesla decide which lane to travel in, then Tesla’s non-beta FSD is already more capable than the Mercedes level 3. Yes, you need to keep your hands on the wheel, and you’re not permitted to watch movies. But otherwise, the Tesla will drive itself as well as the Mercedes, and it will do that very well at full highway speeds, in stop-and-go traffic jams, and everything in between. Level 3 self-driving below 37 mph isn’t an improvement over advanced Autopilot.

Tesla’s Autopilot is currently a level 2 driving feature, which basically means it should be used as an aid to the driver; it does not drive on its own. If you’ve ever used Autopilot, you’ll know that Tesla is right on the cusp of level 3. With the FSD Beta, I believe Tesla is planning on pushing level 3 automation to everyone who has purchased the FSD package. Level 3 is where the FSD features break free, stop being just an aid, and turn into an “autonomous” vehicle that no longer needs the driver’s constant attention, and not just below 37 mph.

Tesla’s Optimus Robot Learns to Walk Without Vision [VIDEO]

By Karan Singh
Optimus Falls - but catches itself!
Not a Tesla App

Tesla recently showed off a demo of Optimus, its humanoid robot, walking around in moderately challenging terrain—not on a flat surface but on dirt and slopes. These things can be difficult for a humanoid robot, especially during the training cycle.

A Look Behind the Curtain

Most interestingly, Milan Kovac, VP of Engineering for Optimus, clarified what it takes to get Optimus to this stage. Let’s break down what he said.

Optimus is Blind

Optimus is getting seriously good at walking now - it can keep its balance over uneven ground, even while walking blind. Tesla is currently using just the robot’s non-vision sensors, all feeding a neural net running on the embedded computer.

Essentially, Tesla is building Optimus from the ground up, relying on as much additional sensor data as possible while it trains vision. This is similar to how Tesla trains FSD on vehicles, using LiDAR rigs to validate the vision system’s accuracy. While Optimus doesn’t have LiDAR, it relies on all those other onboard sensors, many of which will likely be simplified away as vision takes over as the primary sensor.

Today, Optimus is walking blind, but it’s able to react almost instantly to changes in the terrain underneath it, even if it falls or slips. 

What’s Next?

Next up, Tesla AI will be adding vision to Optimus - helping complete the neural net. Remember, Optimus runs on the same overall AI stack as FSD - in fact, Optimus uses an FSD computer and an offshoot of the FSD stack for vision-based tasks.

Milan mentions they’re planning on adding vision to help the robot plan ahead and improve its walking gait. While the zombie shuffle is iconic and a little bit amusing, getting humanoid robots to walk like humans is actually difficult.

There’s plenty more, too - including better responsiveness to velocity and direction commands, and learning to fall and stand back up. Falling while protecting yourself to minimize damage comes naturally to humans - but not to a robot. Training Optimus to do so is essential to keeping the robot, the environment around it, and the people it interacts with safe.

We’re excited to see what’s coming next for Optimus, because it’s already being put to work, in some fashion, in Tesla’s factories.

Is Tesla Close to Licensing FSD? GM Quits Cruise, BMW Praises Tesla

By Karan Singh
Not a Tesla App

In a relatively surprising move, GM announced that it is realigning its autonomy strategy and prioritizing advanced driver assistance systems (ADAS) over fully autonomous vehicles.

GM is effectively closing Cruise (autonomous) and focusing on its Super Cruise (ADAS) feature. The engineering teams at Cruise will join the GM teams working on Super Cruise, effectively shuttering the fully autonomous vehicle business.

End of Cruise

GM cites “an increasingly competitive robotaxi market” and the “considerable time and resources” required to scale the business to a profitable level. Essentially, they’re unable to keep up with competitors at current funding and research levels, which puts them further and further behind.

Cruise has been offering driverless rides in several cities, using HD maps of those cities alongside vehicles equipped with a dazzling array of over 40 sensors. That means each Cruise vehicle is a massive investment that doesn’t turn a profit while collecting data to work towards autonomy.

Cruise has definitely been on the back burner for a while, and a quick glance at their website - since it's still up for now - shows the last time they officially released any sort of major news packet was back in 2019. 

Competition is Killer

Their direct competitor, Waymo, is funded by Google, which maintains a direct interest in ensuring it has a play in the AI and autonomy space.

Interestingly, this news comes just a month after Tesla’s We, Robot event, where they showed off the Cybercab and the Robotaxi network, as well as plans to begin deployment of the network and Unsupervised FSD sometime in 2025. Tesla is already in talks with some cities in California and Texas to launch Robotaxi in 2025.

GM Admits Tesla Has the Right Strategy

As part of the business call following the announcement, GM admitted that Tesla’s end-to-end and Vision-based approach towards autonomy is the right strategy. While they say Cruise started down that path, they’re putting aside their goals towards fully autonomous vehicles for now and focusing on introducing that tech in Super Cruise instead.

With GM now focusing on Super Cruise, they’ll put aside autonomy and instead focus solely on ADAS features to relieve driver stress and improve safety. While those are positive goals that will benefit all road users, full autonomy is really the key to removing the massive impact that vehicle accidents have on society today.

In addition, Super Cruise is extremely limited: it cannot brake for traffic controls and doesn’t work in adverse conditions, not even rain. It can only function when lane markings are clear, there are no construction zones, and there is a working data connection.

The final key to the picture is that the vehicle has to be on an HD-mapped and compatible highway - essentially locking Super Cruise to wherever GM has time to spend mapping, rather than being functional anywhere in a general sense, like FSD or Autopilot.

Others Impressed - Licensing FSD

Interestingly, some other manufacturers have also weighed in on the demise of Cruise. BMW, in a now-deleted post, said that a demo of Tesla’s FSD is “very impressive.” There’s a distinct chance that BMW and other manufacturers are watching to see what Tesla does next.

BMW chimes in on a now-deleted post. The Internet is forever, BMW!
Not a Tesla App

It seems that FSD has caught their eye after We, Robot, and that the online demonstrations of FSD V13.2 were the pivot point. At the 2024 Shareholder Meeting earlier in the year, Elon shared that several manufacturers had reached out, looking to understand what would be required to license FSD from Tesla.

There is a good chance 2025 will be the year we’ll see announcements of the adoption of FSD by legacy manufacturers - similar to how we saw the surprise announcements of the adoption of the NACS charging standard.
