Inside Tesla’s FSD: Patent Explains How FSD Works

By Karan Singh
Not a Tesla App

Thanks to a Tesla patent published last year, we have a great look into how FSD operates and the various systems it uses. SETI Park, who examines and writes about patents, also highlighted this one on X.

This patent breaks down the core technology used in Tesla’s FSD and gives us a great understanding of how FSD processes and analyzes data.

To make this easily understandable, we’ll divide it up into sections and break down how each section impacts FSD.

Vision-Based

First, the patent describes a vision-only system, in line with Tesla's stated goal, that enables vehicles to see, understand, and interact with the world around them. It relies on multiple cameras, some with overlapping coverage, that together capture a 360-degree view around the vehicle, mimicking and in some ways improving on human vision.

What's most interesting is how the system adapts to the various focal lengths and perspectives of the different cameras around the vehicle, then combines everything into a cohesive picture; we'll get to that part shortly.
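To make the multi-camera normalization concrete, here's a minimal sketch, not taken from the patent, of how images from cameras with different focal lengths and mounting angles could be mapped into a single vehicle-centric frame. The camera names, focal lengths, and angles below are hypothetical.

    # Illustrative only: a toy pinhole-camera model showing how pixels from
    # cameras with different focal lengths and mounting yaws can be turned
    # into direction rays in one shared vehicle frame.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class CameraConfig:
        name: str
        focal_length_px: float   # focal length expressed in pixels (hypothetical values)
        yaw_deg: float           # mounting direction relative to the car's nose

    def pixel_to_ray(cam: CameraConfig, u: float, v: float, width: int, height: int) -> np.ndarray:
        """Convert an image pixel into a unit direction ray in the vehicle frame."""
        # Ray in the camera's own frame (z forward, principal point at image center).
        ray_cam = np.array([u - width / 2, v - height / 2, cam.focal_length_px])
        ray_cam /= np.linalg.norm(ray_cam)
        # Rotate by the camera's mounting yaw so every camera shares one frame.
        yaw = np.radians(cam.yaw_deg)
        rot = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                        [0.0, 1.0, 0.0],
                        [-np.sin(yaw), 0.0, np.cos(yaw)]])
        return rot @ ray_cam

    cams = [CameraConfig("main", 1200.0, 0.0),
            CameraConfig("narrow", 2600.0, 0.0),           # longer focal length, tighter view
            CameraConfig("left_repeater", 900.0, -135.0)]  # rear-facing side camera
    for cam in cams:
        print(cam.name, pixel_to_ray(cam, 640, 360, 1280, 720))

The point of a normalization step like this is that downstream components never have to care which physical camera a feature came from.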

Branching

The system is divided into two branches - one for Vulnerable Road Users, or VRUs, and one for everything else. It's a simple divide: VRUs are pedestrians, cyclists, baby carriages, skateboarders, and animals - essentially anything that can get hurt - while the non-VRU branch handles everything else, such as cars, emergency vehicles, traffic cones, and debris.

Splitting the pipeline into two branches enables FSD to look for, analyze, and prioritize each category differently. Essentially, VRUs are prioritized over other objects throughout the Virtual Camera system.
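As a rough illustration of that split (the class names are paraphrased from the article; the code itself is a hypothetical sketch, not Tesla's):

    # Toy example: route raw detections into the VRU and non-VRU branches.
    VRU_CLASSES = {"pedestrian", "cyclist", "stroller", "skateboarder", "animal"}

    def route_detections(detections):
        """Split detections into the two branches so VRUs can be prioritized."""
        vru = [d for d in detections if d["label"] in VRU_CLASSES]
        non_vru = [d for d in detections if d["label"] not in VRU_CLASSES]
        return vru, non_vru

    detections = [{"label": "pedestrian", "distance_m": 12.0},
                  {"label": "car", "distance_m": 30.0},
                  {"label": "traffic_cone", "distance_m": 8.0}]
    vru_branch, other_branch = route_detections(detections)
    print("VRU branch:", vru_branch)
    print("non-VRU branch:", other_branch)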

The many data streams and how they're processed. (Image: Not a Tesla App)

Virtual Camera

Tesla processes all of that raw imagery, feeds it into the VRU and non-VRU branches, and picks out only the essential information, which is then used for object detection and classification.

The system then places these objects in 3D space and creates "virtual cameras" at varying heights. Think of a virtual camera like a real camera you'd use to shoot a movie: it lets you see the scene from a particular perspective.

The VRU branch places its virtual camera at human height, which enables a better understanding of VRU behavior, probably because there's far more data captured at human height than from above or any other angle. Meanwhile, the non-VRU branch raises its virtual camera above that height, letting it see over and around obstacles for a wider view of traffic.

This effectively provides two forms of input for FSD to analyze—one at the pedestrian level and one from a wider view of the road around it.
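Here's a toy sketch of that idea: the same 3D scene projected from two virtual camera heights, one near eye level (VRU branch) and one raised (non-VRU branch). The heights, the simple pinhole projection, and the sample points are all illustrative assumptions, not values from the patent.

    import numpy as np

    def project(points_xyz: np.ndarray, cam_height_m: float, focal: float = 500.0) -> np.ndarray:
        """Project 3D points (x forward, y left, z up, in meters) into a virtual
        pinhole camera mounted at cam_height_m and looking straight ahead."""
        shifted = points_xyz - np.array([0.0, 0.0, cam_height_m])
        x, y, z = shifted[:, 0], shifted[:, 1], shifted[:, 2]
        valid = x > 0.1                      # drop points behind the camera
        u = focal * y[valid] / x[valid]      # horizontal image coordinate
        v = focal * z[valid] / x[valid]      # vertical image coordinate
        return np.stack([u, v], axis=1)

    scene = np.array([[10.0, 1.5, 0.9],      # a pedestrian 10 m ahead
                      [25.0, -3.0, 0.7]])    # a car 25 m ahead
    print("human-height view:", project(scene, cam_height_m=1.6))
    print("raised view:", project(scene, cam_height_m=4.0))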

3D Mapping

Now, all this data has to be combined. The two virtual camera views are synced, and everything they observe is fed back into the system to maintain an accurate 3D map of what's happening around the vehicle.

And it's not just the cameras. The Virtual Camera system and 3D mapping work together with the car’s other sensors to incorporate movement data—speed and acceleration—into the analysis and production of the 3D map.

The easiest way to understand this system is through the FSD visualization displayed on the screen. It picks up and tracks many moving cars and pedestrians at once, but what we see is only a fraction of the information it's tracking. Think of each object as carrying a list of properties that isn't displayed on the screen. A pedestrian, for example, may have properties the system can access that state how far away it is, which direction it's moving, and how fast it's going.

Other moving objects, such as vehicles, may have additional properties, such as their width, height, speed, direction, planned path, and more. Even static elements carry properties: the road, for example, has its width, speed limit, and more determined from AI and map data.

The vehicle itself has its own set of properties, such as speed, width, length, planned path, etc. When you combine everything, you end up with a great understanding of the surrounding environment and how best to navigate it.
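A hypothetical sketch of what such a per-object property list might look like in code (the fields mirror the examples above; none of this is taken from the patent):

    from dataclasses import dataclass, field

    @dataclass
    class TrackedObject:
        kind: str            # "pedestrian", "vehicle", "road", ...
        is_vru: bool
        distance_m: float
        heading_deg: float
        speed_mps: float
        extra: dict = field(default_factory=dict)   # width, height, planned path, etc.

    objects = [
        TrackedObject("pedestrian", True, 12.0, 90.0, 1.4),
        TrackedObject("vehicle", False, 30.0, 0.0, 13.0,
                      extra={"width_m": 1.9, "planned_path": "straight"}),
        TrackedObject("road", False, 0.0, 0.0, 0.0,
                      extra={"width_m": 7.2, "speed_limit_kph": 50}),
    ]
    for obj in objects:
        print(obj)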

The Virtual Mapping of the VRU branch. (Image: Not a Tesla App)

Temporal Indexing

Tesla calls this feature Temporal Indexing. In layman's terms, it's how the vision system analyzes images over time and keeps track of them. Rather than working from a single snapshot, FSD uses a series of them, which lets it understand how objects are moving. This enables object path prediction and also lets FSD estimate where vehicles or objects might be, even when it doesn't have a direct view of them.

This temporal indexing is done through "Video Modules", which are the "brains" that analyze the sequences of images, tracking objects over time and estimating their velocities and future paths.
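A bare-bones sketch of that idea, purely illustrative and far simpler than anything in the patent: keep a short history of timestamped positions per object, estimate velocity from it, and extrapolate where the object will be, even while it's briefly out of view.

    from collections import deque

    class Track:
        def __init__(self, history_len: int = 10):
            self.history = deque(maxlen=history_len)   # (timestamp_s, x_m, y_m)

        def update(self, t, x, y):
            self.history.append((t, x, y))

        def predict(self, t_future):
            """Linear extrapolation from the two most recent observations."""
            if len(self.history) < 2:
                return None
            (t0, x0, y0), (t1, x1, y1) = self.history[-2], self.history[-1]
            vx, vy = (x1 - x0) / (t1 - t0), (y1 - y0) / (t1 - t0)
            dt = t_future - t1
            return x1 + vx * dt, y1 + vy * dt

    track = Track()
    for i in range(5):
        track.update(i * 0.1, 10.0 - i * 0.5, 2.0)   # object closing at 5 m/s
    print("predicted position at t=1.5 s:", track.predict(1.5))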

Once again, heavy traffic and the FSD visualization, which keeps track of many vehicles in lanes around you—even those not in your direct line of sight—are excellent examples.

End-to-End

Finally, the patent also mentions that the entire system, from front to back, can be - and is - trained together. This training approach, which now includes end-to-end AI, optimizes overall system performance by letting each individual component learn how to interact with other components in the system.
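As a toy illustration of what "trained together" means in practice (this uses PyTorch and a made-up two-stage model, not Tesla's actual networks): two stages are composed into one model, and a single loss updates the parameters of both at once, so each stage learns to work with the other.

    import torch
    import torch.nn as nn

    perception = nn.Sequential(nn.Linear(8, 16), nn.ReLU())   # stand-in for the vision stage
    prediction = nn.Linear(16, 2)                             # stand-in for a downstream head
    model = nn.Sequential(perception, prediction)             # composed end to end

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3) # one optimizer over all stages
    x, target = torch.randn(32, 8), torch.randn(32, 2)        # dummy data
    for _ in range(100):
        loss = nn.functional.mse_loss(model(x), target)
        optimizer.zero_grad()
        loss.backward()    # gradients flow through both stages together
        optimizer.step()
    print("final loss:", loss.item())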

How everything comes together. (Image: Not a Tesla App)

Summary

Essentially, Tesla sees FSD as a brain, and the cameras are its eyes. It has a memory, and that memory enables it to categorize and analyze what it sees. It can keep track of a wide array of objects and properties to predict their movements and determine a path around them. This is a lot like how humans operate, except FSD can track far more objects at once and determine properties like speed and size much more accurately. On top of that, it can do it faster than a human and in all directions at once.

FSD and its vision-based camera system essentially create a live 3D map of the road that is constantly updated and used to make decisions.


Is Tesla Planning to Add Steam Support to All Vehicles?

By Karan Singh
Not a Tesla App

Yesterday, we reported that Tesla updated their Steam integration on Model S and Model X vehicles. The update was part of their 2024 Holiday Update, but it looks like there may be more to this than a simple update.

Steam, a video game library app, makes it easy for users to buy and launch games on their computers. A couple of years ago, Valve, the company behind Steam, launched its own standalone handheld, the Steam Deck, which runs a custom Linux-based OS called SteamOS.

Steam Launch

When Tesla launched the redesigned Model S and Model X, it introduced a dedicated gaming GPU with 16GB of RAM and touted the ability to play top-tier PC games in its vehicles.

In 2022, Tesla finally launched the Steam app for the Model S and Model X as part of its 2022 Holiday Update. The Steam app runs SteamOS, the same OS as the Steam Deck, in a virtual environment.

However, earlier this year, Tesla stopped including the GPU and Steam (Beta) in its vehicles, and we haven't seen any updates to the Steam app in quite some time. In fact, we thought Tesla was axing its gaming-on-the-go ambitions.

SteamOS Update

The Steam app, which is still in Beta, is getting an interesting update for the Model S and Model X vehicles with the discrete GPU.

Those vehicles received an update to SteamOS 3.6 - the same version of SteamOS that runs on the Steam Deck. While nothing has visually changed, there's a long list of performance optimizations under the hood to get things running more smoothly.

Comparing Steam Deck to Tesla Vehicles

Let's take a look at the Steam Deck. According to Valve, its onboard Zen 2 CPU and GPU combined deliver roughly 2 TFLOPS of compute, which is fairly respectable but much lower than today's home consoles. The Steam Deck handles 720p gaming fairly smoothly on low-to-medium settings on the go and is also built on the AMD platform.

AMD-equipped Teslas, including the Model 3 and Model Y, pack an older Zen+ (Zen 1.5) APU (a processor with a combined CPU and GPU). AMD claims that the V1000 - the same embedded chip family used in AMD Tesla vehicles (YE1807C3T4MFB) - delivers up to 3.6 TFLOPS of processing power, including 4K encoding and decoding on the integrated GPU.

While that's not enough for 4K gaming or comparable to a full-blown console or desktop GPU, it's enough raw horsepower for light gaming and, on paper, more powerful than the Steam Deck.

The Model S and Model X's discrete GPU brings that up to about 10 TFLOPS - comparable to modern consoles like the Xbox Series X at 12 TFLOPS.
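Putting the figures quoted above side by side (these are the article's approximations, not official benchmarks):

    quoted_tflops = {
        "Steam Deck (CPU + GPU)": 2.0,
        "Tesla AMD Ryzen APU": 3.6,
        "Model S/X discrete GPU": 10.0,
        "Xbox Series X": 12.0,
    }
    for name, tflops in sorted(quoted_tflops.items(), key=lambda kv: kv[1]):
        print(f"{name:24s} ~{tflops:>4.1f} TFLOPS")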

Steam Gaming for All Vehicles?

The fact that Tesla is updating SteamOS even though the feature is no longer available in any new vehicles could indicate that Tesla is not only bringing Steam back to Teslas but that it’s going to play a much bigger role.

Since SteamOS runs in a virtual environment on top of Tesla's own OS, we could see Tesla bring it to all of its current vehicles, including the Model 3, Model Y, and Cybertruck. Steam in these vehicles would likely support any game that can run on the Steam Deck.

This Steam update, which includes performance improvements and a variety of fixes, seems to have quietly flown under most people's radar. It could be a very exciting development for those who enjoy gaming, especially in their Tesla.

Tesla Holiday Update Weather Features: All the Small Details

By Karan Singh
Robert Rosenfeld / YouTube

As part of Tesla’s 2024 Holiday Update, Tesla included two awesome new features - Weather at Destination and the long-awaited Weather Radar Overlay. These two features are big upgrades built upon the weather feature that was added in update 2024.26. The original weather feature added an hourly forecast, as well as the chance of precipitation, UV index, Air Quality Index, and other data.

However, this update also added some smaller weather touches, such as the vehicle alerting you if the weather at the destination will be drastically different from the current weather.


Weather At Destination

When you're navigating to a destination and viewing the full navigation direction list, the text under the arrival time will show the expected weather at your destination. You can also tap it to bring up the full weather pop-up with your destination's complete weather information.

Note the weather under the arrival time. (Image: Not a Tesla App)

You can also tap the weather icon at the top of the interface at any time and tap Destination to switch between the weather at your current location and the weather at your destination.

You're probably thinking that the weather at your destination doesn't matter much when you're three hours away - but the trip planner takes that into account. It adds up both charging time and travel time and shows you the destination weather for your expected arrival time.
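Conceptually, it works something like this sketch (the durations and forecast data are made up; this is just to show the arrival-time lookup, not Tesla's implementation):

    from datetime import datetime, timedelta

    def weather_at_arrival(now, drive_min, charge_min, hourly_forecast):
        """Estimate the ETA from driving plus charging time, then look up the
        forecast entry for that hour."""
        eta = now + timedelta(minutes=drive_min + charge_min)
        key = eta.replace(minute=0, second=0, microsecond=0)   # truncate to the hour
        return eta, hourly_forecast.get(key, "no forecast yet")

    now = datetime(2024, 12, 20, 9, 0)
    forecast = {datetime(2024, 12, 20, 12, 0): "light snow, -2°C",
                datetime(2024, 12, 20, 13, 0): "snow, -3°C"}
    eta, wx = weather_at_arrival(now, drive_min=170, charge_min=25, hourly_forecast=forecast)
    print(eta, "->", wx)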

And if the destination weather is drastically different or inclement - rain or snow, say, while you've got sunshine and rainbows where you are - it will be shown above the destination ETA for a few moments before tucking itself away.

Tesla also recently introduced a new voice command. Asking, “What’s the weather?” or something similar will now bring up Tesla’s weather popup.

The weather pop-up above the ETA. (Image: Not a Tesla App)

One limitation, though—if you’re planning a long road trip that is more than a day of driving, the weather at destination feature won’t be available until you get closer.

Weather Radar Overlay

As part of the improvements to weather, Tesla has also added a radar overlay for precipitation. You can access the new radar overlay by tapping the map and then tapping the weather icon on the right side of the map. It’ll bring up a radar overlay centered on your vehicle. It’ll animate through the radar data over the last 3 hours so that you can see the direction of the storm, but you can also pause it at any point.

You’re able to scroll around in this view and see the weather anywhere, even if you zoom out. It also works while you’re driving, although it can be a little confusing if you’re trying to pay attention to the navigation system. If you like to have Points of Interest enabled on your map, the weather overlay will hide POIs except for Charging POIs.

Requirements / Data

Unfortunately, you’ll need Premium Connectivity for any of the weather features to work, and being on WiFi or using a hotspot will not be enough to get the data to show up. The data, including the weather radar, is provided by The Weather Channel.

As for supported models, weather and weather at destination are available on all vehicles except the 2012-2020 Model S and Model X. The weather radar has stricter requirements: it needs the newer AMD Ryzen-powered infotainment system found in the 2021+ Model S and Model X and more recent Model 3 and Model Y vehicles.
