Elon tweeted that v9 of the FSD beta would remove its reliance on radar completely and instead make decisions based purely on vision. Humans don’t have radar, after all, so it seems like a logical solution, and it tells us Tesla is feeling much more confident in its vision AI.
Radar and vision each have their advantages, but radar has thus far been much more reliable in detecting objects and determining speed. If you’ve ever noticed your Tesla being able to detect two vehicles in front of you when you can only see the one directly ahead of you, that’s radar at work.
In this situation, the radio waves from the radar sensor bounce underneath the car in front of you and continue traveling, detecting that there is another object ahead even though the cameras could never “see” it.
It really is one of those wow moments where you can feel the future and the potential for AI-powered cars to one day drive better than humans. It’s baby steps, but slowly we’ll see more and more situations where the vehicle simply sees or does something we never could.
There’s no doubt that more sensors could provide a more reliable and accurate interpretation of the real world, as they each have their own advantages. In an ideal world, a vehicle with radar, lidar, vision, ultrasonic sensors and even audio processing would provide the best solution. However, more sensors and systems come at a price, resulting in increased vehicle cost and system complexity.
After all, humans are relatively safe drivers with just two “cameras” and vision alone. If Tesla can completely solve vision, it should be able to achieve superhuman driving capabilities. Teslas have eight cameras facing in all directions, and they’re able to analyze all of them concurrently and make much more accurate interpretations than we ever could in the same amount of time.
Tristan on Twitter recently had some great insight into Tesla vision AI and how they’re going to replace radar. Here’s what Tristan had to say:
"We recently got some insight into how Tesla is going to replace radar in the recent firmware updates + some nifty ML model techniques
From the binaries we can see that they've added velocity and acceleration outputs. These predictions, in addition to the existing xyz outputs, give much of the same information that radar traditionally provides (distance + velocity + acceleration).
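To make that point concrete, here is a minimal sketch (our illustration, not Tesla's actual code) of how velocity and acceleration fall out of a sequence of per-frame position estimates via finite differences. The frame rate and positions are made-up numbers.

```python
# Illustrative only: deriving velocity and acceleration, the quantities radar
# reports directly, from a short history of per-frame position estimates.

def finite_difference(values, dt):
    """Backward finite differences between successive samples."""
    return [(b - a) / dt for a, b in zip(values, values[1:])]

dt = 1 / 15                                    # ~15 frames covering one second
xs = [10.0 + 2.0 * i * dt for i in range(16)]  # object receding at 2 m/s

vs = finite_difference(xs, dt)   # velocity estimates, ~2.0 m/s each step
accs = finite_difference(vs, dt) # acceleration estimates, ~0.0 m/s^2 (constant velocity)
```

A real network would learn a far more robust version of this mapping from noisy video, but the underlying information content is the same.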
For autosteer on city streets, you need to know the velocity and acceleration of cars in all directions but radar is only pointing forward. If it's accurate enough to make a left turn, radar is probably unnecessary for the most part.
How can a neural network figure out velocity and acceleration from static images you ask?
They've recently switched to something that appears to be styled on a Recurrent Neural Network.
Net structure is unknown (LSTM?) but they're providing the net with a queue of the 15 most recent hidden states. Seems quite a bit easier to train than normal RNNs which need to learn to encode historical data and can have issues like vanishing gradients for longer time windows.
The velocity and acceleration predictions are new; by giving the net the last 15 frames (~1s) of data, I'd expect you can train a highly accurate net to predict velocity + acceleration based on the learned time series.
They've already been using these queue based RNNs with the normal position nets for a few months presumably to improve stability of the predictions.
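The "queue of hidden states" idea Tristan describes can be sketched in a few lines. This is a hypothetical illustration, not Tesla's architecture: rather than a classic RNN that must compress all history into one recurrent state, you keep the last 15 per-frame feature vectors and hand them all to a prediction head at once. All names and sizes here are ours.

```python
from collections import deque

QUEUE_LEN = 15  # last ~1 second of frames, per the thread

def frame_features(frame):
    # Stand-in for the per-frame vision backbone (a CNN in practice).
    return [float(x) for x in frame]

def predict_motion(feature_queue):
    # Stand-in for the head that regresses velocity/acceleration from the
    # stacked time series; here it just averages the first feature over time.
    firsts = [f[0] for f in feature_queue]
    return sum(firsts) / len(firsts)

queue = deque(maxlen=QUEUE_LEN)   # oldest hidden state falls off automatically
for t in range(20):               # 20 incoming frames
    queue.append(frame_features([t, t * 2]))
    if len(queue) == QUEUE_LEN:
        estimate = predict_motion(queue)
```

Because the head always sees all 15 states directly, no gradient has to survive 15 recurrent multiplications, which is why this setup sidesteps the vanishing-gradient problem of normal RNNs.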
This matches with the recent public statements from Tesla about new models training on video instead of static images.
To evaluate the performance compared to radar, I bet Tesla has run some feature importance techniques on the models and radar importance has probably dropped quite a bit with the new nets. See tools like https://captum.ai for more info.
I still think that radar is going to stick around for quite a while for highway usage since the current camera performance in rain and snow isn't great.
NoA often disables in mild rain. City streets might behave better since the relative rain speed is lower.
One other nifty trick they've recently added is a task to rectify the images before feeding them into the neural nets.
This is common in classical CV applications, so I'm surprised it only popped up in the last couple of months.
This makes a lot of sense since it means that the nets don't need to learn the lens distortion. It also likely makes it a lot easier for the nets to correlate objects across multiple cameras since the movement is now much more linear.
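For readers unfamiliar with rectification, here is a minimal sketch using a one-parameter radial distortion model (k1 only, Brown-Conrady style). Real pipelines use full calibrated camera intrinsics, for example via OpenCV's undistortion functions; every number and simplification here is illustrative, not Tesla's calibration.

```python
def rectify(image, w, h, k1, cx, cy):
    """For each undistorted output pixel, sample the distorted source pixel."""
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # normalized coordinates relative to the principal point (cx, cy)
            nx, ny = (x - cx) / w, (y - cy) / h
            r2 = nx * nx + ny * ny
            scale = 1 + k1 * r2            # forward radial distortion model
            sx = int(round(cx + nx * scale * w))
            sy = int(round(cy + ny * scale * h))
            if 0 <= sx < w and 0 <= sy < h:
                out[y][x] = image[sy][sx]
    return out
```

With k1 = 0 the mapping is the identity; with a fisheye-like k1 the edges of the frame get pulled straight, so lines that are straight in the world are straight in the rectified image, which is exactly what makes cross-camera correlation easier for the nets.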
For more background on LSTMs (Long Short-Term Memory) see https://towardsdatascience.com/illustrated-guide-to-lstms-and-gru-s-a-step-by-step-explanation-44e9eb85bf21
They're tricky to train because they need to encode history, which is fed into future runs. The more times you pass the state along, the more the earlier frames are diluted, hence "vanishing gradients".
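A back-of-the-envelope illustration of why gradients vanish: backpropagating through T recurrent steps multiplies the gradient by the recurrent weight (and the activation's derivative) once per step, so with a weight below 1 the contribution of early frames decays geometrically. The weight value here is arbitrary.

```python
def gradient_through_time(w, steps):
    """Gradient magnitude surviving `steps` unrolled recurrent multiplications."""
    g = 1.0
    for _ in range(steps):
        g *= w          # one multiply per unrolled timestep
    return g

recent = gradient_through_time(0.5, 3)    # 0.125: the last few frames still matter
distant = gradient_through_time(0.5, 15)  # ~3e-5: a frame 15 steps back is nearly invisible
```

This is why handing the net an explicit queue of the last 15 states, rather than asking one recurrent state to remember them, makes training so much easier.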
Tesla’s FSD beta v9 will be a big step forward from what FSD beta users have been using, where the system still relied on radar. And it’ll be an even bigger leap from what non-beta testers currently have access to. We can’t wait. Now where’s that button?
Almost ready with FSD Beta V9.0. Step change improvement is massive, especially for weird corner cases & bad weather. Pure vision, no radar.
Ganesh Venkataramanan, who has led Tesla’s ambitious Dojo supercomputer project for the past five years, has left the company. Bloomberg reported the development, stating that the news was confirmed by sources familiar with the matter. Peter Bannon, a former executive at Apple Inc. and a director at Tesla for the last seven years, has now taken the helm of the project.
Venkataramanan's departure from Tesla last month is now stirring conversations about the potential impacts on Tesla's future initiatives. His contributions to the Dojo project have been pivotal, especially in designing the custom D1 chip that powers the supercomputer. Venkataramanan, with his extensive experience, including a significant tenure at Advanced Micro Devices Inc. (AMD), was a crucial asset in setting up Tesla’s AI hardware and silicon teams in 2016.
Dojo: A Cornerstone for Tesla’s Self-Driving Aspirations
The Dojo supercomputer is a critical element of Tesla's strategy to enhance its self-driving capabilities. Designed to train machine learning models integral to Tesla's autonomous systems, Dojo processes vast amounts of data captured by Tesla vehicles. This rapid data processing is essential for improving the company’s algorithms, with analysts suggesting that Dojo could be a significant competitive advantage for Tesla. In a recent estimation by Morgan Stanley, the project could potentially add $500 billion to Tesla’s value.
Elon Musk has been vocal about the company's commitment to the Dojo project, planning an investment exceeding $1 billion by the end of 2024. The project's importance was underscored in Tesla's decision to shift from relying on Nvidia Corp.’s supercomputers to developing Dojo, poised to rival systems from Hewlett Packard Enterprise Co. and IBM.
Looking Ahead: Impact and Future Prospects
The recent leadership changes raise questions about the future direction of the Dojo project. Venkataramanan's exit, coupled with the departure of another critical artificial intelligence player from Tesla last year, Andrej Karpathy, signals a transition period for the company’s AI and self-driving teams.
However, Tesla's robust talent pool, blending experienced and emerging professionals, offers a silver lining. Bannon's promotion to lead the Dojo project is seen as a strategic move, leveraging his experience and insights gained from his tenure at Apple. Moreover, the recent installation of Dojo hardware in Palo Alto, California, marks a step forward in centralizing and enhancing the project’s capabilities.
Tesla’s ambitions for Dojo extend to making it one of the world’s top supercomputers. The company envisions reaching a computational capability of 100 exaflops* by October 2024, a testament to its commitment to advancing artificial intelligence and self-driving technology.
* Confused about “exaflops”? “Flops” stands for Floating Point Operations Per Second, a way to measure how fast a computer can process data. “Exa” means a billion billion, or a 1 followed by 18 zeros (1,000,000,000,000,000,000). So when we say a computer can perform 100 exaflops, it can do 100 billion billion calculations per second. That’s incredibly fast!
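To put the footnote's arithmetic in perspective, here is the target expressed in code. The laptop figures are our own illustrative assumptions, not anything Tesla has published.

```python
# "Exa" is 10**18, so 100 exaflops is 10**20 floating-point operations per second.
EXA = 10 ** 18
dojo_target_flops = 100 * EXA   # Tesla's stated October 2024 target

# Compare against a hypothetical 2 GHz laptop core sustaining
# ~4 floating-point operations per cycle (illustrative numbers):
laptop_flops = 2e9 * 4
speedup = dojo_target_flops / laptop_flops   # on the order of ten billion times faster
```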
Tesla is adding a new 'High Fidelity Park Assist' feature in this year's Holiday Update
Following initial reactions to Tesla's 2023 Holiday Update, Elon Musk acknowledged the need for improvement, stating, "We need to step up our game." His post on X was followed by Tesla shedding more light on the Holiday Update than what was in the initial leak.
Call me old, but I remember a time when you bought a car, and that was it; the dealer and manufacturer didn’t give you anything else. So is the Tesla community acting a little bit spoiled here? Absolutely. But it also shows how high Tesla has set the bar with its previous Holiday Updates.
Initial Release and Feedback
The initial release of the 2023 Holiday Update, version 2023.44.25, received mixed reactions from the Tesla community, with some owners expressing disappointment over the lack of groundbreaking new features. But the newly announced features may serve as better stocking stuffers.
The initial rollout included something owners have been asking for: the blind spot monitor. The camera feed that appears when you change lanes will now highlight in red if there is something in your blind spot. It’s not clear whether it will be accompanied by a tone.
Tesla’s blind spot warning in this year’s holiday update
Here are other features in the leaked update that are being tested by employees:
Navigation and Safety Features: Including symbols for speed cameras, stop signs, and traffic lights in navigation, and the automatic 911 call feature in case of an accident.
Trip Planning via Tesla Mobile App: Allowing for more detailed trip planning, including multiple stops and charging points.
Apple Podcasts Integration: Allowing users to sync with Apple devices for a seamless podcast experience, directly addressing the demand for a richer in-car entertainment system.
New Games and Enhancements: Tesla Arcade gets updates to Beach Buggy Racing and Polytopia (the Diplomacy update), as well as Vampire Survivors’ Chilling update.
Light Show Improvements: There’s a new light show that’s included with your vehicle. You’ll also be able to upload several light shows on a single USB drive and pick one from the vehicle, instead of having to use multiple USB drives, one for each light show.
More Live Sentry Mode Cameras: You will now be able to view the B-pillar cameras directly from the Tesla app. This brings the number of viewable cameras in the app up to seven. The only ones still missing are the alternative front-facing cameras that are telephoto and wide-angle, which wouldn’t bring much additional value. Although the B-pillars are viewable in the app with this update, they will still not be used to record during Dashcam or Sentry Mode events.
High-Fidelity Park Assist
Tesla's new parking assist feature will dynamically recreate scenes in real-time
In response to the feedback and Musk's statement, Tesla unveiled additional features in its updated holiday update, including an improved park assist with enhanced visualizations.
This feature provides a 3D reconstruction of the vehicle's surroundings while parking, akin to a 360-degree camera system found in other high-end vehicles. The system is clearly leveraging improvements to Tesla Vision to create the surrounding environment, such as cars, pillars and walls.
This feature also appears to change the color of objects depending on how close they are to your vehicle. In the image we can see the pillars are orange, but if we look closer, the object behind the vehicle is also orange near the bottom. The sides of the vehicles next to the Tesla also have a slight hint of orange, indicating their proximity.
However, it looks like this feature may be even better than it initially appears. The vehicles in the image aren’t just predefined 3D models that Tesla created, like the ones used in Autopilot visualizations. These models appear to be dynamically created using vision, so no two cars would look alike, much closer to what LiDAR is able to achieve. The visualization provides a true representation of the environment around the vehicle. You can see that each vehicle is made up of layers and has blurred edges toward the rear, where the camera would have a hard time seeing.
These 3D models could be a sneak peek at the future of FSD visualizations.
High-Fidelity Park Assist Requirements
A big question on everyone’s mind is who will receive this new park assist feature. Tesla didn’t address this in their post on X besides providing a disclaimer that the features in the holiday release are subject to model and region availability. Tesla often likes to test features in select markets before making them available everywhere. It’s hard to say whether that will be the case here. There likely aren’t any legal ramifications around providing visualizations, so that’s a good sign that this feature will be available in most regions, either in the holiday update, or soon afterward.
However, there are still questions around which models or hardware will be required. From the image shared, we can see it’s offered on a Model Y, removing any speculation of it possibly requiring the HD radar in the new Model S/X. We also don’t think it will require FSD hardware 4.0, so the remaining questions are whether it requires MCU 3, or the FSD package.
Given that Tesla is calling this Park Assist, it doesn’t appear to be linked to Auto Park, which is an FSD package feature. When Tesla rolled out visual and audio alerts for vehicles without ultrasonic sensors, it called the feature Park Assist, and that was available to all owners.
Whether this improved Park Assist feature requires a vehicle with MCU 3 will depend on the level of processing power required. It’ll certainly require more than the current visualizations given that it’s building the scene in real-time, so we’re hopeful that it’ll work on MCU 2 vehicles too, but we just don’t have enough information right now to make the call.
Custom Lock Sounds
Soon you'll be able to choose a custom locking sound for your car
Tesla also announced a fun and whimsical feature that allows owners to customize the lock sound of their Tesla. No longer will you need to listen to the car’s horn when it locks as you walk away. Now you’ll be able to customize the lock sound of the vehicle. Tesla is including several options, including sounds like a screaming goat, a jingle, a rubber ducky, a quack sound, an old school horn and applause. However, you’ll also be able to upload your own file to create a truly unique experience.
You can pick anything, from a bird’s tweet to a favorite video game sound. You’ll only be limited by the maximum upload size, which according to a Tesla engineer is a 1MB WAV file, roughly 40 seconds of audio at good quality.
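That 40-second figure checks out for a modest uncompressed format. WAV stores raw samples, so file size is simply sample rate × sample depth × channels × duration. The exact rate and depth Tesla expects aren't public; the parameters below are guesses that happen to land near 40 seconds.

```python
def wav_seconds(size_bytes, sample_rate, bits_per_sample, channels=1):
    """Duration of an uncompressed WAV payload of a given size."""
    bytes_per_second = sample_rate * (bits_per_sample // 8) * channels
    return size_bytes / bytes_per_second

seconds = wav_seconds(1_000_000, 12_000, 16)  # 12 kHz, 16-bit mono: ~41.7 s
alt = wav_seconds(1_000_000, 25_000, 8)       # 25 kHz, 8-bit mono: exactly 40 s
```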
This feature is possible due to the vehicle’s external pedestrian warning speaker. So if you have Tesla’s Boombox feature or your vehicle makes a sound when traveling under 20 MPH, then you should receive this fun enhancement.
Rear Seat Audio and Gaming
You'll now be able to play games on Tesla's rear screens
Enhancing the Tesla Arcade experience, passengers in the rear seats can now play games on the rear touchscreen. This feature, especially when paired with Tesla Arcade’s compatibility with PS4, PS5, Xbox Controllers, and rear-screen Bluetooth Headsets, is a welcome addition for families and long trips.
Much like the new Model 3, which received rear audio over Bluetooth support in the 2023.38 update, the new Model S and Model X will also receive this ability in the holiday update.
New Game - Castle Doombad
Tesla announced one other feature in the 2023 holiday update that hadn’t been previously leaked, and that’s a new game called Castle Doombad. Castle Doombad is a single player tower defense, puzzle-like game that’s currently available on iOS and Android, but has an upcoming release on PC and the Nintendo Switch. This game is expected to require MCU 3.
The rollout of the 2023 Holiday Update is expected to follow a similar timeline to last year. Tesla announced that the update will roll out starting next week. However, it’s not clear whether this will also include FSD Beta testers that are on a 2023.27 update.
Like a spoiled child on Christmas morning, Tesla owners still ask, “Is that it?” Well… possibly, but there may be more to look forward to early next year as Tesla builds off of the new High-Fidelity Park Assist feature.
Advanced Smart Summon: Upgrading the Smart Summon feature to be more intuitive and efficient, especially in complex parking scenarios.
Reverse Summon / Park Seek: What happened to Tesla dropping its passengers and driver off at the location and then finding a parking spot on its own?
Enhanced FSD Visualizations: Expanding the Full Self-Driving visualizations to more regions or models or completely recreating the FSD visualizations using the same neural networks Tesla is using for the High-Fidelity Park Assist feature.
TeslaFi logs your drives and charging sessions, letting you keep a log of your vehicle's activity. We highly recommend checking them out if you use your car for business trips and would like to keep track of reimbursements, if you like to see how much you spend on charging or if you just love statistics. Visit their site and see everything they have to offer!
The EV Universe newsletter reports distill more than 100 EV news sources into a 10-minute read every week. We cover both Tesla and the rest of the EV industry. Join over 3,000 EV geeks like us and subscribe to the free weekly newsletter here.
Tesla Android Project enables you to run Android apps in your Tesla. The platform is Open Source and you can deploy it on your own Raspberry Pi 4. Consider supporting the initiative by donating or purchasing the Compute Module 4 Bundle that delivers the best experience. Get $20 off by using the code: NotATeslaApp
The official Tesla app only notifies you if your car is broken into. By installing Sentry Pro on your phone, you will be notified for all Sentry Mode events. Stay connected and avoid potential surprises by receiving notifications. Stop constantly checking the cameras to ensure safety. Check only when necessary, save battery and get peace of mind. Get a 7 day free trial here!