Tesla adds a Driver Monitoring System (DMS) to Model 3 and Model Y cars with a cabin camera

By Nuno Cristovao

Tesla Autopilot has always required full driver attention, and it will continue to for the foreseeable future. Tesla has enforced this by detecting whether torque is being applied to the steering wheel.

Tesla's DMS

Tesla has relied on this method since the introduction of Autopilot, but the approach has had several flaws over the years.

Since Tesla looks for a certain amount of torque to be applied to the steering wheel, the car can sometimes ask for the driver’s attention even when the driver is attentive and has their hands on the wheel. The interval at which the car checks for active participation has changed over the years, but at roughly every 30-60 seconds, the prompts can become an annoyance to drivers.

The second reason that steering wheel torque doesn’t work well as a driver monitoring system is that it is easily defeated. There have been numerous devices that mimic the weight of a hand on the wheel, and let’s face it, their use doesn’t lead to anything good for the driver or for Tesla.

Tesla's Driver Monitoring System

Yesterday, we saw the first vehicles receive a true driver monitoring system. For vehicles with a cabin camera, which includes all Model 3 and Model Y vehicles, Tesla will be able to turn on its camera-based driver monitoring system (DMS). It appears the feature is currently limited to the US and to radar-less cars, but we expect this to change in the future. The release notes for update 2021.4.15.11 state:

The cabin camera above your rearview mirror can now detect and alert driver inattentiveness while Autopilot is engaged. Camera data does not leave the car itself, which means the system cannot save or transmit information unless data sharing is enabled. To change your data settings, tap Controls > Safety & Security > Data Sharing on your car's touchscreen.

Update: With the release of update 2021.32.5, Tesla has started to roll out driver monitoring to vehicles with radar as well, and is expanding the feature outside of the US.

Much like Tesla’s approach to FSD, the cabin camera records and analyzes the video stream, attempting to detect several objects and driver attributes. Each attribute is assigned a probability. If the probability for an attribute that Tesla associates with inattention crosses its threshold, the car can take further action, such as asking the driver to pay attention, turning off Autopilot, or pulling over.

According to GreenTheOnly on Twitter, the camera detects whether the driver is looking down or to the side, tracks eye movement, and picks up other signals as well, such as whether the driver is wearing sunglasses and how well the camera can see. Although Tesla is looking for a variety of distractions, it appears that alerts are currently only triggered when the driver is on their phone and not looking at the road.
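To illustrate the general idea of per-attribute probabilities and thresholds, here is a minimal sketch in Python. The attribute names loosely follow GreenTheOnly’s findings, but the structure, values, thresholds, and escalation logic are illustrative assumptions, not Tesla’s actual implementation.

```python
# Hypothetical sketch of a per-attribute threshold check for a camera-based DMS.
# Attribute names loosely follow GreenTheOnly's findings; the values and
# thresholds are illustrative assumptions, not Tesla's real parameters.

# Probabilities (0.0-1.0) produced by the cabin camera network for one frame window.
frame_attributes = {
    "eyes_down": 0.12,
    "eyes_off_road": 0.08,
    "phone_use": 0.91,
    "sunglasses": 0.40,       # informational: reduces confidence in eye tracking
    "camera_blinded": 0.05,   # informational: how poorly the camera can see
}

# Only some attributes indicate inattention; others just describe visibility.
INATTENTION_THRESHOLDS = {
    "eyes_down": 0.8,
    "eyes_off_road": 0.8,
    "phone_use": 0.8,
}

def check_attention(attributes: dict[str, float]) -> str:
    """Return an escalation step if any inattention attribute crosses its threshold."""
    for name, threshold in INATTENTION_THRESHOLDS.items():
        if attributes.get(name, 0.0) >= threshold:
            # A real system would feed a state machine that escalates from a
            # visual alert toward disengaging Autopilot or pulling over.
            return f"alert_driver ({name}={attributes[name]:.2f})"
    return "ok"

print(check_attention(frame_attributes))  # -> alert_driver (phone_use=0.91)
```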

The first cars with this feature enabled have started rolling out. It doesn’t replace Tesla’s steering wheel torque detection, but it adds another layer of protection. We hope that as Tesla expands the system’s capability, it may one day remove the need to keep your hands on the wheel entirely.

Tesla has also recently rolled out detection of whether there is a driver in the driver’s seat while Autopilot is in use. It’s possible Tesla will combine all of these attributes into a single algorithm that ultimately decides whether the driver is paying attention. This could greatly reduce the number of times the driver is asked to pay attention when they already are, and also increase safety by reminding us when we’re not.
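As a purely speculative illustration of how such signals could be combined, the sketch below fuses a camera attention estimate with seat occupancy and steering wheel torque. Every name, weight, and threshold here is an assumption made for illustration; none of it comes from Tesla.

```python
# Purely hypothetical sketch of fusing multiple driver-monitoring signals.
# The function name, inputs, and thresholds are invented for illustration only.

def driver_needs_reminder(seat_occupied: bool,
                          wheel_torque_detected: bool,
                          camera_attention: float) -> bool:
    """Decide whether to prompt the driver, given all available signals."""
    if not seat_occupied:
        return True  # no driver detected: always escalate
    # If the camera is confident the driver is attentive, skip the torque nag.
    if camera_attention >= 0.9:
        return False
    # Otherwise fall back to the traditional steering wheel torque check.
    return not wheel_torque_detected

# Attentive driver with a light grip on the wheel: no reminder needed.
print(driver_needs_reminder(True, False, 0.95))   # False
# Driver looking at their phone and applying no torque: remind them.
print(driver_needs_reminder(True, False, 0.30))   # True
```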

Tesla’s Robotaxi Easter Egg: Surprise Tip

By Karan Singh
BLKMDL3

Tesla has always embraced whimsy in its software, packing it with playful Easter eggs and surprises. From transforming the on-screen car into James Bond’s submarine to the ever-entertaining Emissions Testing Mode and the fan-favorite Rainbow Road, these hidden features have become a signature part of Tesla’s software.

Of course, the launch of a new product like Robotaxi wouldn’t be complete without a fun little Easter egg of its own. The end-of-ride screen in the Robotaxi app presents a familiar option: “Leave a tip.”

Anyone pleased with their Robotaxi ride may be tempted to leave one. However, tapping the button presents our favorite hedgehog instead of a payment screen.

The app displays a message, alongside the familiar Tesla hedgehog, that simply states “Just kidding.”

While it’s a fun prank, it’s also a nod to what Tesla really wants to do: reinforce the economic advantage of an autonomous Robotaxi Network. Without a driver, there is simply no need to tip. The gesture is playful, but it’s a reminder of Tesla’s real aim here.

Even Elon is in on the joke. It is a small detail, but it’s all about those small details with Tesla.

First Recorded Tesla Robotaxi Intervention: UPS Truck Encounter [VIDEO]

By Karan Singh
@BLKMDL3 on X

Over the last few days, we’ve seen some exceptionally smooth performance from the latest version of FSD on Tesla’s Robotaxi Network pilot. However, the entire purpose of an early access program with Safety Monitors is to identify and learn from edge cases.

This week, the public saw the first recorded instance of a Safety Monitor intervention, providing a first look at how they’re expected to stop the vehicle.

The event involved a complex, low-speed interaction with a reversing UPS truck. The Safety Monitor intervened to stop the Robotaxi immediately, potentially avoiding a collision with the delivery truck. Let’s break down this textbook case of real-world unpredictability.

The Intervention [VIDEO]

In a video from a ride in Austin, a Robotaxi is preparing to pull over to its destination on the right side of the road, with its turn signal active. Ahead, a UPS truck comes to a stop. As the Model Y begins turning into the spot, the UPS truck, seemingly without signaling, starts to reverse. At this point, the Safety Monitor stepped in and pressed the In Lane Stop button on the main display, bringing the Robotaxi to an immediate halt.

This is precisely why Tesla has employed Safety Monitors in this initial pilot. They are there to proactively manage ambiguous situations where the intentions of other drivers are unclear. The system worked as designed, but it raises a key question: what would FSD have done on its own? It’s not clear whether the vehicle saw the truck backing up, or what it would have done once it detected it. It’s also unclear whether the UPS driver realized that the Robotaxi was pulling into the same spot at the exact same time.

It’s possible this wouldn’t have resulted in a collision at all, but the Safety Monitor did the right thing by stepping in to prevent a potential one, even at low speed. Any collision just a few days after the Robotaxi Network launch could create complications for Tesla.

Who Would Be At Fault?

This scenario is a classic edge case. It involves unclear right-of-way and unpredictable human behavior. Even for human drivers, the right-of-way here is complicated. While a reversing vehicle often bears responsibility, a forward-moving vehicle must also take precautions to avoid a collision. This legal and practical gray area is what makes these scenarios so challenging for AI to navigate.

Would the Robotaxi have continued, assuming the reversing truck would stop?

Or would it have identified the potential conflict and used its own ability to stop and reverse?

Without the intervention, it’s impossible to say for sure. However, crucial context comes from a different clip involving, surprisingly, another UPS delivery truck.

A Tale of Two Trucks

In a separate video posted on X, another Robotaxi encounters a remarkably similar situation. In that instance, as another UPS delivery truck obstructs the path forward, the Robotaxi comes to a stop to let its two passengers out just a few feet from their destination.

Once they depart, the Robotaxi reverses and performs a three-point turn to extricate itself from the tight spot. It does all of this without human intervention, correctly identifying the situation on its own.

This second clip is vital because it proves that the Robotaxi's FSD build has the underlying logic and capability to handle these scenarios. It can, and does, use reverse to safely navigate complex situations.

A Valuable Data Point

Far from being a failure, this first intervention should be seen as a success for Tesla’s safety methodology. It shows the safety system is working, allowing monitors to mitigate ambiguous events proactively.

More importantly, this incident provides Tesla’s FSD team with an invaluable real-world data point.

By comparing the intervened ride with the successful autonomous one, Tesla’s engineers can fine-tune FSD’s decision-making, which will likely have a positive impact on its edge case handling in the near future.

This is the purpose of a public pilot — to find the final edge cases and build a more robust system, one unpredictable reversing truck at a time.
