AI Day: Tesla FSD Hardware 4.0

By Nuno Cristovao

Tesla spent a good part of AI Day talking about their big improvements in vision and AI where they’ve been making amazing progress as shown with some of the latest FSD Betas. They also talked about Dojo, their new supercomputer used for AI which will become operational next year. They even announced a humanoid robot, Tesla Bot.

Tesla talks about FSD Hardware 4.0

However, one topic that wasn’t part of the official presentation, but that we were able to learn more about during the question-and-answer portion of the event, is Tesla’s FSD hardware 4.0 computer.

Tesla released FSD hardware 2.5 in 2017 and introduced hardware 3.0 in 2019, so we’ve been expecting an updated version of the FSD computer for a while now.

All of Tesla’s vehicles today ship with Tesla’s hardware 3.0 computer, including the newly redesigned Model S. Normally, you’d see companies shy away from talking about new hardware that wasn’t part of the event, but Tesla had no problem answering questions about a possible new FSD computer.

When asked if there was a new FSD computer on the horizon, Elon Musk said he expects the current hardware to be capable of autonomous driving that’s safer than a human, but of course Tesla wants to keep expanding and improving, so they’re always developing more powerful computers.

Elon guessed that we may see hardware 4.0 next year with the introduction of the Cybertruck. He hinted at hardware 4.0 providing about 4x the compute power of hardware 3.0. For those of you with Cybertruck reservations, this makes that vehicle even more exciting.

In another question related to new hardware, Elon also said that Tesla is looking into new cameras. He was clear that the current cameras aren’t a hindrance, but if they think they can do better, then they will. He didn’t go into detail about whether this would be tied to the Cybertruck or hardware 4.0, but it seems likely. He also didn’t specify whether the new cameras would have higher resolution, larger sensors, or possibly even better placement on the vehicle.

It’s been shown before that although the current camera placement on Teslas allows for human-like capabilities and provides a view similar to that of a human in the driver’s seat, it could be further improved by placing cameras near the headlights looking off to the side.

Although it could be disappointing for some to see new hardware announced so soon after they bought a hardware 3-equipped vehicle, Tesla needs to continue advancing. New hardware may not necessarily provide new capabilities, but it could introduce a higher level of safety.

The Curious Case of Banish - What Happened to Tesla’s Self-Parking Feature?

By Karan Singh
Not a Tesla App

For years, Tesla owners have been intrigued by the promise of a truly hands-off parking experience, one that goes beyond simply letting your car park itself when you arrive at a parking lot. Banish, sometimes also known as Banish Autopark or Reverse Summon, was envisioned as the ultimate parking convenience. Your Tesla would drop you off at the entrance to your destination, full chauffeur style, and then leave to find a suitable parking spot nearby. Coupled with Park Seek, your Tesla would drive through a parking lot to locate an open space and then park itself, waiting on standby.

Then, when you were ready, you would be able to Summon it to the entrance, arriving right as you do, for the smoothest autonomous experience. However, despite the initial excitement and focus from Elon back when V12.5 was supposed to include it, we’ve heard very little about Banish. It has remained a relatively elusive feature - the last time we saw anything on it was all the way back in October 2024, when it was alluded to in some Tesla app code.

So, what happened to Banish?

The Original Promise: A Smarter Way to Park

The concept of Banish was a logical extension of Tesla’s existing Summon and Autopark capabilities. Instead of just parking when a spot is identified by the driver, Banish and Park Seek were meant to give your Tesla more agency. After dropping off its occupants, your Tesla would leverage FSD and its autonomy to drive through the parking lot, locate an open space, park itself, and wait on standby until summoned back.

This functionality was often discussed in conjunction with improvements to Autopark and was highlighted as a step towards Tesla’s vision of a Robotaxi future. Interestingly, while the October 2024 FSD Roadmap mentioned Park, Unpark, and Reverse, it did not mention Banish. The absence of Banish as a milestone in the FSD Roadmaps leads us to believe that Tesla has put this feature on the back burner while it works on other FSD-related priorities.

Today’s FSD & Autopark: Capable, But Not Quite Banish

Fast forward to Spring 2025, and FSD V13 does exhibit some self-parking capabilities. As noted by many on social media, FSD can identify and maneuver into parking spots when arriving at a destination. However, this is generally not the proactive Park Seek envisioned for Banish. The current system requires the driver to be present, even if hands-off. It often identifies spots only as it directly approaches them, and its seeking behavior in a larger parking lot is extremely limited.

Users have also observed that while Tesla’s vision-based Autopark is often impressively accurate even on the massive Cybertruck, letting FSD nose-in to a spot can sometimes result in the car being poorly aligned or missing the lines entirely. This suggests that while your Tesla can park itself, the nuanced understanding and precision required for a truly reliable and Unsupervised Banish experience are still under development.

V13’s list of upcoming features indicates that it is supposed to provide additional support for personal garages, parking garages, and driveways, which haven’t been added yet. In fact, none of V13’s upcoming features have been realized yet - and it has been a while since a proper FSD update has come from Tesla.

The Underlying Tech is Ready

Interestingly, the core AI capabilities required for Banish and Park Seek are detailed extensively in a recently published Tesla patent covering Autonomous and User Controlled Vehicle Summon to a Target. This patent describes generating an occupancy grid of the parking lot, conducting path planning to the chosen spot, and making decisions to safely navigate the lot at low speeds while accounting for pedestrians and other road users.

This indicates that Tesla has been working on the foundational AI for low-speed maneuvering in tight locations for quite some time. However, the challenge likely lies in achieving the necessary reliability, safety, and real-world robustness across an almost infinite variety of parking lot designs and in dynamic conditions.
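To make that flow a little more concrete, here is a rough Python sketch of the Park Seek idea. It is purely illustrative and not Tesla’s code: a toy occupancy grid marks drivable aisles, obstacles, and open stalls, and a simple breadth-first search finds a path from the drop-off point to the nearest open stall.

```python
from collections import deque

# Toy occupancy grid: 0 = drivable aisle, 1 = obstacle or occupied space, 2 = open parking stall.
# In the patent's terms, a grid like this would come from camera-based perception of the lot.
GRID = [
    [0, 0, 0, 0, 0],
    [1, 1, 0, 1, 2],
    [0, 0, 0, 0, 0],
    [2, 1, 0, 1, 1],
]

def find_path_to_open_stall(grid, start):
    """Breadth-first search from the drop-off cell to the nearest open stall (value 2)."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([(start, [start])])
    visited = {start}
    while queue:
        (r, c), path = queue.popleft()
        if grid[r][c] == 2:  # reached an open parking stall
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in visited and grid[nr][nc] != 1:
                visited.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # no reachable stall found

print(find_path_to_open_stall(GRID, start=(0, 0)))
```

A real system plans in continuous space with vehicle dynamics, moving pedestrians, and perception uncertainty, which is exactly where the reliability challenge described above comes in.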

What’s Next? Robotaxi.

The impending launch of Tesla’s Robotaxi Network in Austin in June brings the need for Banish-like capabilities into sharp focus. For a fleet of autonomous vehicles to operate efficiently, they must be able to manage their parking autonomously. A Robotaxi will need to drop off its passenger at the entrance to a location and then proceed to either its next pickup or autonomously find a parking or staging spot to await its next ride or even go back to base to charge.

It is plausible that a functional, robust version of Park Seek and Banish is being developed and tested internally as a component of Tesla’s Robotaxi launch and, presumably, of what will become FSD Unsupervised. The initial rollout in Austin may just be the first real-world deployment of this tech from Tesla.

While Banish has yet to launch, the key components are in place and just need to be improved. The issue likely lies in safety, as parking lots account for 1 in 5 accidents in North America.

In all likelihood, Banish isn’t canceled but is being integrated into the FSD Unsupervised and Robotaxi feature set. That means a public rollout will likely depend on achieving a higher level of safety and confidence before Tesla is willing to let vehicles park themselves autonomously, or even while being Supervised through the Tesla app.

For now, you’ll have to keep parking yourself, or letting FSD or Autopark do the job. A convenient curbside drop-off isn’t in the cards yet, but given the necessity for Robotaxi, it’ll need to arrive eventually.

Tesla's Smart Summon Patent Describes How It Works and Hints at Future Abilities

By Karan Singh
Not a Tesla App

Tesla’s Summon, Smart Summon, and Actually Smart Summon features have long been a source of fascination (and occasional frustration), offering FSD users a glimpse into a future where your vehicle picks you up.

While we await further improvements to Actually Smart Summon to increase reliability and range, a recently published Tesla patent (US20250068166A1) provides an inside look into the intricate AI and sensor technologies that make these complex, low-speed autonomous maneuvers possible.

Notably, the list of inventors on this patent reads like a "who's who" of Tesla's AI and Autopilot leadership, including Elon Musk and former Director of AI Andrej Karpathy, among many others.

Though the patent is a continuation of earlier work, with some dates stretching back to 2019, it lays out the core logic that powers Tesla's vision-based system.

Step-by-Step Navigation

Tesla’s patent details a sophisticated system designed to allow a vehicle to autonomously navigate from its current position to a target location specified by a remote user. The remote user can also designate themselves as the target, even while they’re moving, and have the vehicle meet them.

This process begins with destination and target acquisition. The system is designed to receive a target geographical location from a user, for example, by dropping a pin via the Tesla app. Alternatively, it can use a “Come to Me” feature, where the car navigates to the user’s dynamic GPS location. In the same section, the patent also mentions the ability to handle altitude, which is crucial for multi-story parking garages, and even a final orientation on arrival.
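As a rough way to picture what destination and target acquisition could look like as a data structure, here’s a hypothetical Python sketch. The field names are our own invention, not taken from the patent or the Tesla app, but they cover a dropped pin, altitude for garages, a final orientation, and the “Come to Me” case where the target tracks the user’s phone.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SummonTarget:
    """Hypothetical target specification for a Summon request (field names are ours)."""
    latitude: float
    longitude: float
    altitude_m: Optional[float] = None         # useful inside multi-story parking garages
    final_heading_deg: Optional[float] = None  # desired orientation of the car on arrival
    follow_user: bool = False                  # "Come to Me": keep tracking the user's live GPS

def refresh_target(target: SummonTarget, user_lat: float, user_lon: float) -> SummonTarget:
    """If the user is the target, update the goal with their latest phone GPS fix."""
    if target.follow_user:
        target.latitude, target.longitude = user_lat, user_lon
    return target

# Example: a pin dropped on an upper garage level, with the car asked to arrive facing east.
pin = SummonTarget(latitude=30.2672, longitude=-97.7431, altitude_m=9.0, final_heading_deg=90.0)
```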

Occupancy Grid

At the heart of the system is the use of sensor data to perceive the environment. This is done through Tesla Vision, which builds a representation of the surrounding environment, similar to how FSD maps and builds a 3D world in which to navigate. A neural network processes this environment to determine drivable space and generate an “occupancy grid.” This grid maps the area around the vehicle, detailing drivable paths versus obstacles.

The patent still references the use of alternative sensors, like ultrasonic sensors and radar, even though Tesla does not use them anymore. The system can also load saved occupancy grids from when the car was parked to improve initial accuracy.
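Here’s a deliberately simplified Python illustration of that idea. The probability map, threshold, and merge rule are assumptions on our part; the point is just how a drivable-space output from a vision network could be reduced to an occupancy grid and combined with a grid saved from when the car parked.

```python
from typing import Optional
import numpy as np

def build_occupancy_grid(drivable_prob: np.ndarray,
                         saved_grid: Optional[np.ndarray] = None,
                         threshold: float = 0.5) -> np.ndarray:
    """Toy conversion of a per-cell drivable-space probability map into an occupancy grid.

    drivable_prob is assumed to come from a vision network; in the output,
    1 = occupied (or uncertain) and 0 = drivable.
    """
    grid = (drivable_prob < threshold).astype(np.uint8)
    if saved_grid is not None:
        # Conservative merge with the grid saved when the car parked:
        # a cell counts as occupied if either source marks it occupied.
        grid = np.maximum(grid, saved_grid)
    return grid

# Example: a 3x3 patch where the center cell looks uncertain and is treated as occupied.
probs = np.array([[0.9, 0.9, 0.9],
                  [0.9, 0.4, 0.9],
                  [0.9, 0.9, 0.9]])
print(build_occupancy_grid(probs))
```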

Path Planner

Once the environment is understood, a Path Planner Module calculates an intelligent and optimal path to the target. This isn’t just the shortest route; the system uses cost functions to evaluate potential paths, penalizing options with sharp turns, frequent forward/reverse changes, or a higher likelihood of encountering obstacles. The path planning also considers the vehicle’s specific operating dynamics, like its turning radius. Interestingly, the Path Planner Module can also handle multi-part destinations with waypoints - a feature that isn’t available yet on today’s version of Actually Smart Summon.
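Below is a toy example of that kind of cost function in Python. The weights and the two candidate paths are made up, but it shows how penalizing turning and forward/reverse changes can lead the planner to prefer a longer, smoother route over a shorter, jerkier one.

```python
import math

# Hypothetical weights; the patent describes penalizing sharp turns, frequent
# forward/reverse changes, and paths more likely to encounter obstacles.
LENGTH_WEIGHT = 0.1
TURN_WEIGHT = 1.0
REVERSAL_WEIGHT = 5.0

def path_cost(waypoints, reversals=0):
    """Score a candidate path made of (x, y) waypoints; lower is better."""
    length = sum(math.dist(a, b) for a, b in zip(waypoints, waypoints[1:]))
    # Sum the heading change between consecutive segments as a proxy for sharp turns.
    turning = 0.0
    for p0, p1, p2 in zip(waypoints, waypoints[1:], waypoints[2:]):
        h1 = math.atan2(p1[1] - p0[1], p1[0] - p0[0])
        h2 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
        turning += abs(math.atan2(math.sin(h2 - h1), math.cos(h2 - h1)))
    return LENGTH_WEIGHT * length + TURN_WEIGHT * turning + REVERSAL_WEIGHT * reversals

# Pick the cheaper of two made-up candidate routes to the same spot.
candidates = {
    "wide_smooth_loop": ([(0, 0), (10, 0), (10, 5)], 0),
    "tight_zigzag_with_reverse": ([(0, 0), (4, 2), (8, 0), (10, 5)], 1),
}
best = min(candidates, key=lambda name: path_cost(*candidates[name]))
print(best)  # the smoother route wins despite being slightly longer
```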

Generating Commands

Once the path is determined, the Vehicle Controller takes the path and translates it into commands for the vehicle actuators, which control the steering, acceleration, and braking to navigate the vehicle along the planned route. As the vehicle moves, the Path Planner continues to recalculate and adjust the path as required.

Since Actually Smart Summon is nearly autonomous, with the exception of the user having to hold the Summon button (an app update hints that holding the button may not be required for much longer), continuous safety checks are integral. This includes using the Path Planner and the occupancy grid to judge whether there is a chance of a collision, and overriding navigation if necessary. The patent also mentions the possibility of users remotely controlling aspects like steering and speed, but with continuous safety overrides in place. This is another cool little feature that Tesla has yet to include with today’s Actually Smart Summon - being able to control your full-size car like an RC car. It could also be useful for robotaxis if a vehicle gets stuck and needs to be tele-operated.
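To give a feel for how a planned path might be turned into commands with a built-in safety gate, here’s a toy control step in Python. It is not Tesla’s controller, just a sketch in which the car steers toward the next waypoint at creep speed and refuses to move if the occupancy grid marks that cell as occupied.

```python
import math

def control_step(pose, path, occupancy_grid):
    """One iteration of a toy low-speed control loop (not Tesla's controller).

    pose is (x, y, heading_rad); path is a list of (x, y) grid coordinates.
    The step steers toward the next waypoint and refuses to move if the
    occupancy grid marks that cell as occupied.
    """
    x, y, heading = pose
    tx, ty = path[0]
    # Safety gate: never command motion into a cell the grid marks as occupied.
    if occupancy_grid[int(ty)][int(tx)] == 1:
        return {"steer": 0.0, "throttle": 0.0, "brake": 1.0, "note": "hold and replan"}
    desired = math.atan2(ty - y, tx - x)
    steer = math.atan2(math.sin(desired - heading), math.cos(desired - heading))
    return {"steer": steer, "throttle": 0.15, "brake": 0.0, "note": None}  # creep speed

# Example: the next waypoint is clear, so the car creeps forward with zero steering.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(control_step(pose=(0.0, 0.0, 0.0), path=[(2, 0), (2, 2)], occupancy_grid=grid))
```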

Reaching the Target

Upon reaching the destination, or the closest safe approximation (like the other side of a road), the system can trigger various actions. These include sending a notification to the user, turning on the interior or exterior lights, adjusting climate control, and unlocking or opening the doors. Another yet-to-arrive feature is that the patent’s destination triggers also include correctly orienting the vehicle for charging if the destination is a charger. This part of the patent doesn’t reference wireless charging, but we’re sure there’s more to this than it seems.
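If you imagine those triggers as code, it could look something like the entirely hypothetical sketch below. The vehicle methods here are invented for illustration and are not a real Tesla API.

```python
class VehicleStub:
    """Hypothetical stand-in for the vehicle's arrival actions; not a real Tesla API."""
    def notify_user(self, message): print("push notification:", message)
    def set_exterior_lights(self, on): print("exterior lights", "on" if on else "off")
    def set_climate(self, temp_c): print(f"climate set to {temp_c} C")
    def unlock_doors(self): print("doors unlocked")
    def align_for_charging(self): print("orienting vehicle toward the charger")

def on_arrival(vehicle, destination_type):
    """Dispatch the kinds of arrival triggers the patent describes (illustrative only)."""
    vehicle.notify_user("Your car has arrived.")
    vehicle.set_exterior_lights(on=True)
    vehicle.set_climate(21)
    vehicle.unlock_doors()
    if destination_type == "charger":
        vehicle.align_for_charging()  # the patent mentions orienting the car for charging

on_arrival(VehicleStub(), destination_type="charger")
```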

A Glimpse Into the Future

While this patent has dates stretching back to 2019, its recent publication as a continued application tells us that Tesla is still actively iterating on its Summon functionality. It details a comprehensive system that has been well thought out for complex, confined spaces, which will be key both for today’s convenience features like Actually Smart Summon and for Tesla’s upcoming robotaxis.

The depth of engineering described, from neural network-based perception to sophisticated path planning and safety protocols, explains the impressive capabilities of Tesla's Summon features when they work well and the inherent challenges in making them robust across an infinite variety of real-world scenarios. As Tesla continues to refine its AI, the foundational principles laid out in this patent will undoubtedly continue to evolve, actually bringing "Actually Smart Summon" to reality.
