Track Mode turns a Tesla into a stunt car that could keep pace with the Fast and Furious franchise, even Tokyo Drift. But while Dom and the other gearheads would be under the hood making adjustments or hooking up a laptop and some nitrous for an extra boost, Track Mode lets users make stability, braking and cooling changes with a few simple swipes of the screen.
Track Mode is available on the Model 3 Performance and Model Y Performance, as well as the Model S Plaid. Elon Musk has also committed to bringing Track Mode to the Model X Plaid, and he has said it could eventually come to all models, including non-Performance variants.
Track Mode first appeared in 2018 on the Model 3 Performance. But Track Mode V2, an updated version, was sent to Model 3 Performance vehicles via an over-the-air update in 2020 with several enhancements. Several YouTubers wasted no time taking their Model 3 with V2 to the track and showing the results. It's fair to say Track Mode surprised many people with its wide range of adjustments, ease of use and tire-eating capabilities. It left as many rubber marks as it did smiles in most videos.
Track Mode is completely software-based; however, Tesla sells an optional hardware package on its website to ramp up the Model 3 Performance even further. The package includes 20-inch lightweight rims with XL Michelin Pilot Sport Cup 2 tires, upgraded tire pressure sensors, and high-performance brake pads. It's an excellent addition, but certainly not necessary.
Track Mode in Action
Track Mode Software
Now to the software update that beamed into Model 3 Performance vehicles in 2020. Users open the Driving menu, tap Track Mode, and enable it. The screen then displays an overhead view of the car, with green and blue highlights on the components that work hardest when the car is driven hard and therefore need monitoring: the battery pack, the front and rear motors, the brakes, and the tires. A settings tab is also displayed that opens a menu, and that's where the real fun begins. Tesla allows users to make drastic changes or minor tweaks, on the fly, right from the comfort of the vegan driver's seat.
Handling Balance
The first option is Handling Balance, which lets users shift the motor bias from the front to the rear or vice versa. But that's not all. Much like fading a car stereo between the rear speakers, the front speakers, or a blend somewhere in between, Track Mode does the same for power to the motors. The software shifts power toward the front or rear motor in 10 percent increments. This adjustment addresses understeer or oversteer and plays a big part in pulling off burnouts or drifting around a corner.
Stability Assist
Speaking of drifting, the Model 3 has won several safety awards, and many of those accolades may be due to its excellent stability control. The traction control reacts to a loss of grip within ten milliseconds, sending power to the other wheels to avoid slipping and sliding. Stability Assist in Track Mode can be adjusted from -10 to 10. There are several videos of drivers spinning out after turning Stability Assist down to -10, not realizing how much the Tesla does to help the driver control the car.
Regenerative Braking
Track Mode also allows users to adjust how much regenerative braking occurs while on the course, anywhere from zero to 100 percent in 5 percent increments. Unfortunately, this isn't as useful as it sounds: regenerative braking takes load off the friction brakes and helps keep them from overheating, a real threat during track time. Tesla recommends keeping regenerative braking at 100 percent, and even at the lowest setting there doesn't appear to be a way to turn it off completely.
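To make those ranges concrete, here is a minimal Python sketch of what a Track Mode settings object might look like. Tesla doesn't expose a public API for these sliders, so the class, the field names, and the rear-bias convention are purely illustrative.

```python
# Illustrative only: a toy model of the three Track Mode sliders and the
# ranges/increments described above. Tesla exposes no public API for this;
# all names and conventions here are hypothetical.

from dataclasses import dataclass

@dataclass
class TrackModeSettings:
    handling_balance: int = 50   # % of power biased to the rear motor, 0-100 in steps of 10
    stability_assist: int = 0    # -10 (least intervention) to +10 (most intervention)
    regen_braking: int = 100     # % regenerative braking, 0-100 in steps of 5

    def validate(self) -> None:
        if self.handling_balance not in range(0, 101, 10):
            raise ValueError("Handling Balance moves in 10% increments between 0 and 100")
        if not -10 <= self.stability_assist <= 10:
            raise ValueError("Stability Assist ranges from -10 to 10")
        if self.regen_braking not in range(0, 101, 5):
            raise ValueError("Regenerative braking moves in 5% increments between 0 and 100")

# Example: a rear-biased, low-intervention setup for practicing slides.
drift_setup = TrackModeSettings(handling_balance=90, stability_assist=-8, regen_braking=100)
drift_setup.validate()
```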
Cooling Features
That's it for the sliders, but there are also on/off toggles for Post Drive Cooling, Compressor Overclock and Save Dashcam for Laps. Post Drive Cooling and Compressor Overclock help keep components cool during and after a track session. While these seem like easy decisions to leave on, Tesla warns that using Compressor Overclock will reduce the part's lifespan.
Lap Times and Dashcam
Track Mode can record lap times along with video footage
DragTimes/YouTube
As for Save Dashcam for Laps, that opens up another element of Track Mode. After closing the settings and returning to the usual navigation screen, users can tap and hold the icon representing the car to set the finish line. After pushing start, the vehicle uses its location to start and stop lap timing. When the car passes the finish line for the first time, the system traces the course in blue on the screen so drivers can follow their exact path. The screen shows the lap number and lap times. This information, along with video and telemetry data, can then be downloaded and viewed on a computer. There is a lot of data, including vehicle thermals, tire use, acceleration and deceleration rates, and the G-meter. Yes, Track Mode also records the G-forces on the car.
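For a sense of what can be done with that data once it's downloaded, here is a small hypothetical Python sketch showing how lap splits and a rough longitudinal G reading could be derived from finish-line crossing times and speed samples. Tesla's actual telemetry file format isn't documented here, so the structures and field names are assumptions.

```python
# Illustrative only: computing lap splits and a rough longitudinal G reading
# from timestamped finish-line crossings and speed samples. The real telemetry
# file Tesla exports is not documented here, so these structures are assumed.

G = 9.80665  # m/s^2 per g

def lap_times(crossing_timestamps: list[float]) -> list[float]:
    """Each element is the time (in seconds) the car crossed the finish line."""
    return [t2 - t1 for t1, t2 in zip(crossing_timestamps, crossing_timestamps[1:])]

def longitudinal_g(speed_mps: list[float], dt: float) -> list[float]:
    """Approximate longitudinal G-force from successive speed samples."""
    return [(v2 - v1) / dt / G for v1, v2 in zip(speed_mps, speed_mps[1:])]

# Example: three finish-line crossings give two laps of roughly 83.4 s and 82.7 s.
print(lap_times([12.0, 95.4, 178.1]))
# Braking from 60 m/s to 50 m/s over one second is roughly -1 g.
print(longitudinal_g([60.0, 50.0], dt=1.0))
```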
Tesla Warns Users
For all the above reasons, Tesla warns that Track Mode is designed for closed circuit courses. The company states: "It is the driver's responsibility to drive safely and ensure others are not endangered. Track Mode is designed for use by experienced track drivers familiar with the course. Do not use on public roads. It is the driver's responsibility to be in control of the vehicle at all times, including on the track. Because vehicle behavior (including traction and stability control) differs when using Track Mode, always use caution."
With Tesla’s first major expansion of the Robotaxi Geofence now complete and operational, they’ve been hard at work with validation in new locations - and some are quite the drive from the current Austin Geofence.
Validation fleet vehicles have been spotted operating in a wider perimeter around the city, from rural roads in the west end to the more complex area closer to the airport. Tesla mentioned during their earnings call that the Robotaxi has already completed 7,000 miles in Austin, and it will expand its area of operation to roughly 10 times what it is now. This lines up with the validation vehicles we’ve been tracking around Austin.
Based on the spread of the new sightings, the potential next geofence could cover a staggering 450 square miles - roughly a tenfold increase from the current service area of about 42 square miles. You can see this on our map below, which plots the sightings we're tracking.
If Tesla expands into these new areas, the new geofence would match Tesla's tenfold statement and cover approximately 10% of the 4,500-square-mile Austin metropolitan area. If Tesla can offer Robotaxi service across that entire area, it would prove they can tackle just about any city in the United States.
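As a quick back-of-the-envelope check of those figures (all areas are the article's estimates, in square miles):

```python
# Sanity check of the geofence figures quoted above (square miles, estimates).
current_area = 42      # current Austin service area
potential_area = 450   # estimated from validation-vehicle sightings
metro_area = 4_500     # Austin metropolitan area

print(potential_area / current_area)   # ~10.7x the current geofence
print(potential_area / metro_area)     # 0.10 -> about 10% of the metro area
```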
From Urban Core to Rural Roads
The locations of the validation vehicles show a clear intent to move beyond the initial urban and suburban core and prepare the Robotaxi service for a much wider range of uses.
In the west, validation fleet vehicles have been spotted as far out as Marble Falls - a much more rural environment with a different mix of road types, higher speed limits, and potentially new challenges.
In the south, Tesla has been expanding towards Kyle, which is part of the growing Austin-San Antonio suburban corridor along Interstate 35. San Antonio is only 80 miles (roughly a 90-minute drive) away and could easily become part of the existing Robotaxi area if Tesla obtains regulatory approval there.
In the east, we haven't spotted any new validation vehicles. This is likely because Tesla's validation vehicles originate from Giga Texas, which is located east of Austin. We won't really know whether Tesla is expanding in this direction until they start pushing past Giga Texas and toward Houston.
Finally, some validation vehicles have been spotted just north of the newly expanded boundaries, meaning Tesla isn't done in that direction either. This area contains the largest suburban parts of Austin, which so far have not been served by any form of autonomous vehicle.
Rapid Scaling
This new, widespread validation effort confirms what we already know: Tesla runs an intensive period of public data gathering and system testing in a new area right before expanding the geofence. The sheer scale of this new validation zone tells us that Tesla isn't taking this slowly - the next step is going to be a great leap instead, something the company essentially confirmed during the Q&A session on the recent earnings call. The goal is clearly to bring the entire Austin metropolitan area into the Robotaxi Network.
While the previous expansion showed how Tesla can scale the network, this new phase of testing demonstrates how fast they can validate and expand it. Validating across rural, suburban, and urban areas simultaneously shows their confidence in these new Robotaxi FSD builds.
Eventually, all these improvements from Robotaxi will make their way to customer FSD builds sometime in Q3 2025, so there is a lot to look forward to.
For years, the progress of Tesla’s FSD has been measured by smoother turns, better lane centering, and more confident unprotected left turns. But as the system matures, a new, more subtle form of intelligence is emerging - one that shifts its attention to the human nuances of navigating roads. A new video posted to X shows the most recent FSD build, V13.2.9, demonstrating this in a remarkable real-world scenario.
Toll Booth Magic
In the video, a Model Y running FSD pulls up to a toll booth and smoothly comes to a stop, allowing the driver to handle payment. The car waits patiently as the driver interacts with the attendant. Then, at the precise moment the toll booth operator finishes the transaction and says “Have a great day”, the vehicle starts moving, proceeding through the booth - all without any input from the driver.
Notably, there's no gate at this toll booth - the interaction happened naturally with FSD.
While the timing was perfect, FSD wasn't listening to the conversation for clues (maybe one day, with Grok?). The reality, as explained by Ashok Elluswamy, Tesla's VP of AI, is even more impressive:
It can see the transaction happening using the repeater & pillar cameras. Hence FSD proceeds on its own when the transaction is complete 😎
FSD is simply using the cameras on the side of the vehicle to watch the exchange between the driver and attendant. The neural network has been trained on enough data that it can visually recognize the conclusion of a transaction - the exchange of money or a card and the hands pulling away - and understands that this is the trigger to proceed.
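As a rough mental model only - Tesla's actual system is an end-to-end neural network, not hand-written rules like this - the behavior amounts to a perception signal gating the decision to move. Every name in this Python sketch is made up for illustration.

```python
# Conceptual sketch only: NOT Tesla's implementation. It illustrates the idea
# of a vision signal from the side cameras gating the decision to creep
# forward once a toll transaction looks complete. All names are hypothetical.

from dataclasses import dataclass

@dataclass
class CameraFrame:
    source: str               # e.g. "left_repeater", "right_pillar"
    transaction_score: float  # hypothetical classifier output, 0.0 to 1.0

def should_proceed(frames: list[CameraFrame], threshold: float = 0.9) -> bool:
    """Proceed only when a side camera is confident the hand-off is finished."""
    side_cams = {"left_repeater", "right_repeater", "left_pillar", "right_pillar"}
    return any(f.source in side_cams and f.transaction_score >= threshold for f in frames)

# Example: the right repeater sees the attendant's hand pull away.
frames = [CameraFrame("right_repeater", 0.96), CameraFrame("front_main", 0.10)]
if should_proceed(frames):
    print("Transaction complete - creep forward through the booth")
```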
The Bigger Picture
This capability is far more significant than just a simple party trick. FSD is gaining the ability to perceive and navigate a world built for humans in the most human-like fashion possible.
If FSD can learn what a completed toll transaction looks like, that's just one example of the countless other complex scenarios it'll be able to handle in the future. The same visual understanding could be applied to navigating a fast-food drive-thru, interacting with a parking garage attendant, passing through a security checkpoint, or boarding a ferry or vehicle train - all things we thought would come much later.
These human-focused interactions will eventually become even more useful as FSD grows more confident in responding to humans on the road, such as when a police officer directs a vehicle to go a certain way or a construction worker flags it through a site. These are real-world events that happen every day, and it isn't surprising to see FSD picking up on the subtleties and nuances of human interaction.
This isn’t a pre-programmed feature for a specific toll booth. It is an emergent capability of the end-to-end AI neural nets. By learning from millions of videos across billions of miles, FSD is beginning to build a true contextual understanding of the world. The best part - with a 10x context increase on its way, this understanding will grow rapidly and become far more powerful.
These small, subtle moments of intelligence are the necessary steps to a truly robust autonomous system that can handle the messy, unpredictable nature of human society.