Sentry Mode, Tesla's camera-based car alarm and surveillance system, allows the vehicle to detect potential threats nearby.
When Sentry Mode is enabled, the vehicle records from four of its cameras whenever it detects someone nearby or an intrusion.
Although the feature is fantastic and has captured many incidents where the vehicle was damaged, it can also lead to dozens of recordings when there is no threat at all. This is especially true if it's raining, which can trigger Sentry Mode events, or if the vehicle is parked on a busy street with a lot of foot traffic.
With update 2022.44.2, Tesla is introducing two new Sentry Mode features in some markets. Depending on your region, you'll either get the ability to disable camera-based detection, or that option plus the ability to adjust the length of each clip.
Camera-Based Detection
The first is the ability to turn off camera-based detection so that Sentry Mode will only save recordings triggered by an intrusion, such as glass breaking, a door opening, or the vehicle's tilt sensor being set off (only available in some markets).
Turning off Sentry Mode recording when the cameras detect someone near the vehicle will greatly reduce the number of recordings; however, it could also lead to some threats going undetected.
Sentry Mode Clip Length
The second new Sentry Mode feature is the ability to adjust the length of a recorded event. Once a potential threat is detected, Sentry Mode records for the next several minutes unless additional threats are detected. Until now, the number of minutes Sentry Mode records after a threat has been fixed. With 2022.44.2, you'll be able to adjust the length of each Sentry Mode clip.
The clip length option is currently limited to many parts of Europe. Tesla's release notes for those receiving both Sentry Mode improvements read as follows:
Sentry Mode now allows for even greater customization, including:
- Camera-Based Detection, which allows users to disable use of cameras to detect threats.
- Sentry Mode Clip Length, which allows users to specify the length of the clip when a potential threat is detected.

To adjust these Sentry Mode settings, tap Controls > Safety > Sentry Mode.
Other Improvements Coming?
Two other Sentry Mode features have been hinted at in the past, when Elon Musk responded to users on Twitter.
While there is no guarantee that these features will be implemented, his responses do show that he thinks they're a good idea and will likely share them with the team. Whether these features get added largely depends on the feasibility of implementing them, Tesla's roadmap, and the effort required to develop them.
In 2020, Elon also replied to a user on Twitter, suggesting that Tesla would allow an 'incognito' Sentry Mode. The goal would be to avoid letting perpetrators know that they're being recorded, although there could be legal ramifications to adding such a feature.
In update 2022.24, Tesla added the ability to disable Sentry Mode sounds when an intrusion is detected. This could have been Tesla's compromise when trying to create an incognito mode. Although the car will no longer make an audible noise, it will still flash its lights and display the Sentry Mode logo on the screen, letting individuals know that they're being recorded.
Tesla recently added Sentry Mode support in Israel and expanded the countries where Sentry Mode Live Access is available. The feature is now available to users in Hong Kong, Taiwan, Australia, New Zealand, South Korea and Singapore.
Tesla recently showed off a demo of Optimus, its humanoid robot, walking around on moderately challenging terrain: not a flat surface, but dirt and slopes. Terrain like this can be difficult for a humanoid robot, especially during the training cycle.
Most interestingly, Milan Kovac, VP of Engineering for Optimus, clarified what it takes to get Optimus to this stage. Let’s break down what he said.
Optimus is Blind
Optimus is getting seriously good at walking now - it can keep its balance over uneven ground, even while walking blind. Tesla is currently relying only on the robot's onboard sensors, without vision, all processed by a neural net running on the embedded computer.
Essentially, Tesla is building Optimus from the ground up, relying on as much additional sensor data as possible while it trains vision. This is similar to how they train FSD on vehicles, using LiDAR rigs to validate the vision system's accuracy. While Optimus doesn't have LiDAR, it relies on all the other sensors on board, many of which will likely be simplified as vision takes over as the primary sensor.
Today, Optimus is walking blind, but it’s able to react almost instantly to changes in the terrain underneath it, even if it falls or slips.
What’s Next?
Next up, Tesla AI will be adding vision to Optimus - helping complete the neural net. Remember, Optimus runs on the same overall AI stack as FSD - in fact, Optimus uses an FSD computer and an offshoot of the FSD stack for vision-based tasks.
Milan mentions they’re planning on adding vision to help the robot plan ahead and improve its walking gait. While the zombie shuffle is iconic and a little bit amusing, getting humanoid robots to walk like humans is actually difficult.
There’s plenty more, too - including better responsiveness to velocity and direction commands and learning to fall and stand back up. Falling while protecting yourself to minimize damage is something natural to humans - but not exactly natural to something like a robot. Training it to do so is essential in keeping the robot, the environment around it, and the people it is interacting with safe.
We're excited to see what's coming next for Optimus, as it is already being put to work in a limited fashion in Tesla's factories.
In a relatively surprising move, GM announced that it is realigning its autonomy strategy and prioritizing advanced driver assistance systems (ADAS) over fully autonomous vehicles.
GM is effectively closing Cruise (its autonomous vehicle unit) and focusing on its Super Cruise (ADAS) feature. The engineering teams at Cruise will join the GM teams working on Super Cruise, shuttering the fully autonomous vehicle business.
End of Cruise
GM cites "an increasingly competitive robotaxi market" and the "considerable time and resources" required to scale the business to a profitable level. Essentially, they're unable to keep up with competitors at current funding and research levels, falling further and further behind.
Cruise has been offering driverless rides in several cities, using HD maps of those cities alongside vehicles equipped with a dazzling array of over 40 sensors. That means each Cruise vehicle is a massive investment that doesn't turn a profit while it collects data to work towards autonomy.
Cruise has definitely been on the back burner for a while; a quick glance at their website - since it's still up for now - shows that the last time they officially released any sort of major news was back in 2019.
Competition is Killer
Cruise's most direct competitor, Waymo, is funded by Google, which maintains a direct interest in ensuring it has a play in the AI and autonomy space.
Interestingly, this news comes just a month after Tesla’s We, Robot event, where they showed off the Cybercab and the Robotaxi network, as well as plans to begin deployment of the network and Unsupervised FSD sometime in 2025. Tesla is already in talks with some cities in California and Texas to launch Robotaxi in 2025.
GM Admits Tesla Has the Right Strategy
As part of the business call following the announcement, GM admitted that Tesla's end-to-end, vision-based approach to autonomy is the right strategy. While they say Cruise had started down that path, they're putting aside their fully autonomous vehicle goals for now and focusing on bringing that tech to Super Cruise instead.
NEWS: GM just admitted that @Tesla’s end-to-end approach to autonomy is the right strategy.
“That’s where the industry is pivoting. Cruise had already started making headway down that path. We are moving to a foundation model and end-to-end approach going forward.” pic.twitter.com/ACs5SFKUc3
With GM now focusing on Super Cruise, they'll put aside full autonomy and instead concentrate solely on ADAS features that relieve driver stress and improve safety. While those are positive goals that will benefit all road users, full autonomy is the real key to removing the massive impact that vehicle accidents have on society today.
In addition, Super Cruise is extremely limited: it cannot brake for traffic controls and doesn't work in adverse conditions, even rain. It can only function when lane markings are clear, there are no construction zones, and there is a functional data connection.
The final piece of the picture is that the vehicle has to be on an HD-mapped, compatible highway - essentially locking Super Cruise to wherever GM has spent time mapping, rather than working anywhere in a general sense, like FSD or Autopilot.
Others Impressed - Licensing FSD
Interestingly, some other manufacturers have also weighed in on the demise of Cruise. BMW, in a now-deleted post, said that a demo of Tesla's FSD is "very impressive." There's a distinct chance that BMW and other manufacturers are looking to see what Tesla does next.
BMW chimes in on a now-deleted post. The Internet is forever, BMW!
It seems that FSD caught their eye after We, Robot, and that the online demonstrations of FSD V13.2 were the pivot point. At the 2024 Shareholder Meeting earlier in the year, Elon shared that several manufacturers had reached out, looking to understand what would be required to license FSD from Tesla.
There is a good chance 2025 will be the year we see legacy manufacturers announce that they're adopting FSD, similar to the surprise announcements around the adoption of the NACS charging standard.