Tesla Activates In-Cabin Radar in Software Update 2025.2.6

By Not a Tesla App Staff
Not a Tesla App

Tesla has released software update 2025.2.6, and while minor updates typically focus on bug fixes, this one introduces a major new feature. With this update, Tesla has activated the in-cabin radar, a sensor that has been included in some vehicles for more than three years but remained unused until now.

Why Not Vision?

Unlike vision-based systems, radar can precisely measure object dimensions and even detect movement behind obstacles by bouncing radio waves off surrounding surfaces. This allows for more accurate and reliable measurements of occupants that a camera may not be able to see at all, such as a small child seated behind the front seats.

What Tesla Announced

Tesla recently highlighted the 4D radar in the new Model Y, explaining how it will improve passenger safety. Tesla executives stated that the radar would be used to properly classify passengers and improve the way airbags deploy.

Tesla went on to say that a future update will use the in-cabin radar to detect passengers left behind in the vehicle. Since radar can pick up on heartbeat and breathing patterns, it provides a much more accurate method of detecting children left in a vehicle. Tesla explained that the vehicle will send owners a notification via the Tesla app and enable the HVAC system if it detects an occupant, and it will even call emergency services if needed.
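To make the escalation flow Tesla described easier to picture, here is a minimal, purely illustrative Python sketch. Every name in it is hypothetical (Tesla has not published any of this code); it simply walks through the steps from the video: detect an occupant via radar vitals, notify the owner through the app, enable the HVAC, and escalate to emergency services if nothing changes.

```python
from dataclasses import dataclass
import time

@dataclass
class RadarReading:
    heartbeat_detected: bool   # micro-motion consistent with a heartbeat
    breathing_detected: bool   # chest movement consistent with breathing

def occupant_present(reading: RadarReading) -> bool:
    # Radar vitals sensing: either signal suggests a living occupant.
    return reading.heartbeat_detected or reading.breathing_detected

def child_left_alone_response(read_radar, notify_owner, enable_hvac, call_emergency,
                              owner_response_timeout_s: int = 600) -> None:
    """Hypothetical escalation loop for a parked, locked vehicle (not Tesla's code)."""
    if not occupant_present(read_radar()):
        return
    notify_owner("Possible occupant detected in parked vehicle")
    enable_hvac(target_temp_c=22)           # keep the cabin at a safe temperature
    deadline = time.time() + owner_response_timeout_s
    while time.time() < deadline:
        if not occupant_present(read_radar()):
            return                          # occupant removed; stand down
        time.sleep(30)                      # re-check periodically
    call_emergency()                        # no resolution within the timeout
```

The real behavior, thresholds, and timings are up to Tesla; the sketch only shows the order of the steps described in the announcement.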

New Feature in Update 2025.2.6

In update 2025.2.6, Tesla has officially named this feature the “First-Row Cabin Sensing Update,” which appears to align with the first portion of what Tesla discussed in the new Model Y video.

In the release notes, Tesla describes the update as:

“The first-row cabin sensing system has been updated to use cabin radar, which is now standard in all new 2025 Model Ys. Your Model Y was built pre-equipped with the necessary hardware, allowing Tesla to also bring this technology to your vehicle.”

For now, it appears that Tesla is using the radar to detect and classify passengers in the front seats. This could eventually replace traditional seat sensors, reducing the number of hardware components and lowering production costs.

Not all Model Y vehicles with the cabin radar are receiving this feature yet. Tesla is likely testing it in select vehicles and will roll it out more broadly in the near future. The Model Y is also the only model currently receiving the feature, even though additional models include the interior radar.

Tesla also plans to expand the feature later this year, bringing rear-seat passenger detection in Q3 2025. While Tesla announced that capability for the new Model Y, we expect it to be available on all vehicles with the in-cabin radar.

Supported Models

Although Tesla is vague in its release notes, this feature is being added to all Model Ys that include a cabin radar. Tesla started including the cabin radar in 2022, but its availability may vary by region and model. The Model 3 didn’t receive the cabin radar until it was redesigned in 2024, while all Cybertrucks include it.

The owner’s manual for the redesigned Model S and Model X doesn’t specifically mention the interior radar, although Greentheonly believes those vehicles include one as well. We’ll have to wait to see whether they also receive this new feature.

At this time, the feature appears to be only going out to Model Y vehicles, but we expect it to become available on other supported models soon.

We love to see these kinds of updates. Tesla is increasing the safety of existing and new vehicles through a software update while also making them more affordable to own.

Tesla Robotaxi Improvements: Reducing Wait Times by Predicting Demand and Scaling Operators

By Karan Singh
Not a Tesla App

Just over a week into the Robotaxi launch, Tesla began laying the groundwork for a more scalable remote supervision model, which will be key to achieving success with the Robotaxi Network.

About a week ago, Elon Musk posted on X that Tesla will likely reach the crucial safety threshold to enable this shift within a month or two. While that means at least another month of in-vehicle Safety Monitors, it does provide us with a timeline of what to expect.

This timeline came in response to a question about Tesla’s plans for the ratio of autonomous vehicles to remote supervisors. The more vehicles a single human can supervise, the better, especially if that number can be pushed to something drastic, like a 100:1 ratio of vehicles to supervisors. A single human operator would then be able to manage an entire city of Robotaxis, which will be critical to making the Robotaxi Network profitable.
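To put those ratios in perspective, here is a quick back-of-the-envelope calculation with made-up fleet numbers (ours, not Tesla’s):

```python
import math

def supervisors_needed(fleet_size: int, vehicles_per_supervisor: int) -> int:
    # Round up: a partially loaded supervisor still counts as a full hire.
    return math.ceil(fleet_size / vehicles_per_supervisor)

fleet = 1_000  # hypothetical city-wide Robotaxi fleet
print(supervisors_needed(fleet, 3))    # 334 supervisors at a 3:1 ratio
print(supervisors_needed(fleet, 100))  # 10 supervisors at a 100:1 ratio
```

That difference in headcount, and therefore in labor cost per ride, is what makes the higher ratio so important to the economics.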

While Tesla works towards that ambitious future, it is also taking immediate steps to improve the current user experience during the Austin pilot program, where 15-minute wait times have become the norm.

Solving for Wait Times

According to Eric E, one of Tesla’s principal engineers on Robotaxi, the current 15-minute wait times are a classic logistics challenge: the supply of vehicles is lower than the current demand for rides. Tesla’s solution is two-pronged.

First, Tesla is directly increasing supply by hiring more Safety Monitors/Vehicle Operators in Austin, even hosting an on-site hiring event.

Second, Tesla is working to make FSD and the Robotaxi fleet management software faster and smarter. That means using data from the pilot to better orchestrate the fleet, predicting demand and pre-positioning vehicles in prime locations to reduce wait times. After dropping someone off, a vehicle can start traveling toward areas of higher expected demand, even before anyone there has booked a ride.
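As a rough sketch of what “predicting demand and pre-positioning vehicles” can look like in code (a simplified illustration of the general idea, not Tesla’s actual fleet software), the approach is to forecast near-term demand per zone from recent ride history and send idle vehicles toward the zones with the largest expected shortfall:

```python
from collections import Counter

def predict_demand(recent_pickups: list[str]) -> Counter:
    """Naive forecast: assume the next window looks like the recent pickup history."""
    return Counter(recent_pickups)

def reposition_idle_vehicles(idle_vehicles: list[str],
                             vehicles_by_zone: Counter,
                             recent_pickups: list[str]) -> dict[str, str]:
    """Send each idle vehicle to the zone with the largest expected shortfall."""
    demand = predict_demand(recent_pickups)
    assignments = {}
    for vehicle in idle_vehicles:
        # Shortfall = forecast demand minus vehicles already in (or headed to) the zone.
        zone = max(demand, key=lambda z: demand[z] - vehicles_by_zone[z])
        assignments[vehicle] = zone
        vehicles_by_zone[zone] += 1        # that zone is now better covered
    return assignments

# Example: downtown has seen the most pickups, so idle cars head there first.
print(reposition_idle_vehicles(
    idle_vehicles=["car_1", "car_2"],
    vehicles_by_zone=Counter({"downtown": 1, "airport": 2, "campus": 0}),
    recent_pickups=["downtown", "downtown", "downtown", "airport", "campus"]))
```

In practice the forecasting would be far more sophisticated, factoring in time of day, events, and traffic, but the pre-positioning principle is the same.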

Next Up: Remote Supervision

These immediate fixes are all in service of that much larger goal. Scaling the Robotaxi Network isn’t just about having more cars; it’s about increasing the number of vehicles a single human can safely supervise remotely, which is a requirement for Robotaxi to turn a profit.

Elon’s comments give us a rough timeline: a more flexible and favorable ratio of 3:1 (although still far from the ideal 100:1) is likely to be achieved within a few months.

Tesla is committed to safety, as evidenced by the safety monitors in the vehicle. A single incident could not only tarnish the public’s view of the Robotaxi Network but could also halt Tesla’s operations altogether.

The data gathered from more Robotaxis on the road is crucial to the whole project, and Tesla is already issuing newer FSD builds specific to the Robotaxi as that data comes in.

As FSD requires less remote oversight per mile driven autonomously, Tesla can safely increase the number of vehicles per remote supervisor, moving the service closer to its ultimate goal.

Tesla has laid out an aggressive roadmap for the Robotaxi Network and its next few phases. We’ll have to wait and see how this goes over the next few months, and whether Tesla feels comfortable enough to expand the geofence and remove the safety monitors.

Tesla to Integrate xAI's Grok Into Optimus, Helping Bring the Robot to Life

By Karan Singh
Not a Tesla App

Following the recent news that Grok is almost ready for Tesla vehicles, Elon Musk confirmed on X that the next major step is Optimus, Tesla’s humanoid robot. xAI’s advanced Grok models will eventually serve as the voice and brain of Optimus, a convergence of Musk’s two biggest AI ventures: Tesla and xAI.

This will combine a physical humanoid robot (the brawn) with a new brain, Grok. The integration is more than just giving Optimus a voice; it suggests that Tesla is thinking ahead and may intend to use Grok to understand the environment around Optimus, while FSD handles the robot’s movements.

A Symbiotic Relationship

The combination of Optimus and Grok creates a relationship where each component plays to its strengths.

For years, Tesla’s robotics team has been focused on the immense challenge of physical autonomy. Optimus learns complex tasks by watching humans, essentially training itself on video demonstrations, which helps it develop the physical dexterity needed to work in the real world. This is the brawn: the ability to navigate, manipulate objects, and perform useful work.

Grok provides the conversational brain, adding a layer of natural language understanding, reasoning, and interaction. Instead of needing a computer, a specialized app, or pre-programmed commands to give Optimus instructions, a user will be able to simply talk to it in a natural way. This makes Optimus far more approachable and useful, especially for tasks in dynamic environments such as a workplace or a home.
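In architectural terms, the division of labor described here (Grok as the conversational and reasoning layer, Tesla’s control stack as the physical layer) might look something like the following simplified sketch. All of the class and function names are hypothetical; neither Tesla nor xAI has published an Optimus API.

```python
from dataclasses import dataclass

@dataclass
class TaskPlan:
    action: str          # e.g. "fetch"
    target: str          # e.g. "water bottle"
    destination: str     # e.g. "kitchen counter"

class ConversationalBrain:
    """Stands in for Grok: turns natural language into a structured task plan."""
    def parse_request(self, utterance: str) -> TaskPlan:
        # A real system would query the language model; here one example is hard-coded.
        return TaskPlan(action="fetch", target="water bottle", destination="kitchen counter")

class MotionController:
    """Stands in for the FSD-derived control stack: executes the physical steps."""
    def execute(self, plan: TaskPlan) -> None:
        print(f"Navigating to locate the {plan.target}...")
        print(f"Grasping the {plan.target} and carrying it to the {plan.destination}.")

# The "brain" interprets the request; the "brawn" carries it out.
plan = ConversationalBrain().parse_request("Can you grab me a water bottle?")
MotionController().execute(plan)
```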

xAI and Tesla

Viewed from a different perspective, this move isn’t just about upgrading one product. It is the clearest evidence yet that xAI and Tesla are collaborating to build a single, unified AI platform. Musk’s biographer, Walter Isaacson, believes Tesla and xAI will merge, and seeing both companies play critical roles in creating Optimus makes us believe that may very well be the case.

Transformation to a Humanoid Robot

The confirmation of Grok in Optimus is one of the most significant milestones for the project to date. While Optimus’s ability to walk and work (and dance) is already an incredible engineering feat, those have all been physical abilities so far. Adding the ability to interact with Optimus in a human-like way will transform it from a machine into a true, general-purpose humanoid robot.

The ability to understand nuanced requests, ask clarifying questions, and respond intelligently is what will ultimately make Optimus a daily fixture in our lives.
