Elon Musk opened Tesla's AI Day 2022 by saying, "I want to set some expectations with respect to our Optimus Robot," just before the doors opened behind him. A robot walked out, waved at the audience, and did a little dance. It was admittedly a humble beginning, but as Musk explained, "the Robot can actually do a lot more than what we just showed you. We just didn't want it to fall on its face." He summed up his vision for the Tesla Robot: "Optimus is going to be incredible in five years, ten years mind-blowing." The CEO said other technologies that have changed the world have plateaued; the Robot is just getting started.
Tesla's CEO envisions Optimus eventually being like Commander Data, the android from Star Trek: The Next Generation, except it "would be programmed to be less robot-like and more friendly." Undoubtedly there is a long way to go to achieve what Doctor Noonien Soong created in Star Trek: TNG. What was demonstrated onstage wasn't at that level, but several videos throughout the presentation highlighted what the Robot is capable of at this very early stage of development. The audience watched the Robot pick up boxes, deliver packages, water plants, and work at a station in the Tesla factory in Fremont.
Development over 8 Months
The breakdown of some of the systems of the Tesla Robot
Tesla (Edited by Not a Tesla App)
The first robot to take the stage at AI Day was not Optimus but Bumble C, another nod to The Transformers, in which Bumblebee plays a significant role. Bumble C is far less advanced than Optimus, which did appear later, albeit on a cart.
Several Tesla engineers took turns at the microphone describing some of the most complex elements of the project, which was first announced one year ago. Perhaps the best description of the project is that the company is moving from building a robot on wheels to a robot on legs. That may be an oversimplification, though: the car has two motors, while the Robot has 28 actuators.
Overall Design and Battery Life
Tesla's brightest demonstrated how the project has come to life over the past eight months. This group of computer masterminds had to become anatomy experts, as Tesla took hints from the human body to create a humanoid robot. That is an essential factor in creating Optimus: everything people interact with is designed to be usable by a human, with two legs, two arms, ten fingers, and so on. If the Robot differed from what the world is already designed for, everything would have to change. However, recreating the human body and its countless movements would take far too long, so Tesla stripped it down to fewer than 30 core movements, not including the hands.
Just as the human torso contains the heart, the Robot's chest holds the battery. Tesla projects that a single charge of the 2.3-kilowatt-hour pack will provide enough energy for a full day's work. All the battery electronics are integrated into a single printed circuit board within the pack, keeping charge management and power distribution in one place. Tesla applied lessons learned from vehicle and energy production to create the battery, allowing for streamlined manufacturing and simple, effective cooling.
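As a rough back-of-the-envelope check, a 2.3 kWh pack lasting a full workday implies a fairly modest average power draw. The figures below are our own illustrative assumptions, not Tesla's published numbers:

```python
# Rough runtime estimate for a 2.3 kWh pack.
# The average power draw is an illustrative assumption, not a Tesla figure.
PACK_WH = 2300            # 2.3 kWh battery, in watt-hours

avg_power_w = 250         # assumed average draw over a mixed workday
hours = PACK_WH / avg_power_w
print(f"Estimated runtime: {hours:.1f} hours")  # Estimated runtime: 9.2 hours
```

At a heavier sustained load the same pack would last proportionally less, which is why a full-day claim implies light average duty.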
Autopilot Technology
Tesla showed what the Robot sees, and it looked very familiar; that's because the neural networks are pulled directly from Autopilot. Training data had to be collected for indoor settings and other objects the car never encounters. Engineers have trained neural networks to identify high-frequency features and key points within the Robot's camera streams, such as a charging station. Tesla has also adapted the Autopilot simulator for use with the Robot's programming.
Tesla shows off what the Optimus robot sees
Tesla (Edited by Not a Tesla App)
The torso also contains the centralized computer that Tesla says will do everything a human brain does, such as processing vision data, making split-second decisions based on multi-sensory inputs, and supporting communications. In addition, the Robot is equipped with wireless connectivity and audio support. Yes, the Robot is going to have conversations: "we really want to have fun, be utilitarian and also be a friend and hang out with you," said Musk.
Motors Mimic Joints
The 28 actuators throughout the Robot's frame are placed where many of the joints in the human body are. Just one of those actuators was shown lifting a half-tonne, nine-foot concert grand piano. Thousands of test models have been run to show how each motor works with the others and how to most effectively operate the actuators relevant to a task. Even the act of walking requires several calculations that the Robot must make in real time, not only to perform the motion but also to make it appear natural. The robots will be programmed with locomotion code: the desired path goes to the locomotion planner, which generates trajectories and feeds them to state estimation, functioning much like the human vestibular system.
Human hands can move at 300 degrees per second and have tens of thousands of tactile sensors. They can manipulate anything in our daily lives, from bulky, heavy items to delicate ones, and Tesla is now recreating that with Optimus. Six actuators give the robot hand 11 degrees of freedom. It has an in-hand controller that drives the fingers and receives sensory feedback, and the fingers have metallic tendons that provide both flexibility and strength. The hands are being designed to allow a precision grip on small parts and tools.
Responsible Robot Safety
Musk wanted to open AI Day with the epic scene from The Terminator in which a robot crushes a skull. He has heard the fears, with people warning, "don't go down the Terminator path," but the CEO said safety is a top priority. There are safeguards in place, including a design for a localized control ROM that would not be connected to the internet and could turn the Robot off. He likens this to a stop button or remote control.
Optimus Price
Musk said the development of Optimus may broaden Tesla's mission statement to include "making the future awesome." He believes most people do not recognize the potential, and it "really boggles the mind." Musk said, "this means a future of abundance. There is no poverty. You can have whatever you want in terms of products and services. It really is a fundamental transformation of civilization as we know it." All of this at a price predicted to be under $20,000 USD.
One of the big undocumented changes in Tesla’s 2024 Holiday Update was a revamp of the Energy app. While the Model S, Model X, and Cybertruck received the Consumption tab in the Energy app for the first time, the changes made for those models also carried over to the Model 3 and Model Y.
The Consumption tab lets you view your vehicle’s consumption over recent trips as well as projected range estimates based on historical usage, but the options it offers have changed.
Sadly, legacy Model S and Model X vehicles produced before the 2021 refresh still don’t have access to the Energy app at this time.
Energy App
Tesla’s Energy App previously let you view a lot of in-vehicle data on what is consuming energy and how to improve your energy consumption. It was last refreshed in 2022, when it gained Drive, Park, and Consumption tabs to help compare actual vehicle energy consumption against what you’d expect from the EPA ratings.
The old Energy App's consumption page.
Not a Tesla App
Key Changes
The Energy App has seen a lot of changes - mostly in the name of simplicity and reducing confusion. Some changes reduce functionality, but others bring even more. All of these changes impact the Consumption tab - the Park and Drive sections are unchanged.
Distance
Previously, you were able to switch the graph on the Consumption tab to show the last 5, 15, or 30 miles. It is now a static display of the last 200 miles (or 300 km) of driving, whether that covers a single trip or several. Your range prediction and energy usage are now based on those 200 miles instead of the previously selectable distance.
This allows for a more reasonable range prediction, as short bursts of high energy usage, such as time spent accelerating to highway speed from an onramp, now have less of an impact and are instead averaged out by regular driving.
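The arithmetic behind this can be sketched in a few lines. The numbers and function below are illustrative assumptions on our part, not Tesla's actual algorithm:

```python
# Sketch of deriving a projected-range figure from historical consumption.
# All values are illustrative assumptions, not Tesla's actual algorithm.

def projected_range(remaining_kwh: float, avg_wh_per_mi: float) -> float:
    """Miles of range remaining at the given average consumption."""
    return remaining_kwh * 1000 / avg_wh_per_mi

# 50 kWh left in the pack, averaging 260 Wh/mi over the last 200 miles:
print(round(projected_range(50, 260)))  # 192

# Why the long window smooths things out: a 2-mile burst at 600 Wh/mi
# barely moves a 200-mile average dominated by 250 Wh/mi driving.
samples = [250] * 198 + [600] * 2   # one sample per mile
print(sum(samples) / len(samples))  # 253.5
```

With the old 5-mile window, that same 2-mile burst would have dominated the average and tanked the range estimate.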
However, for those who love to take their Teslas to the track or tow regularly, this makes the consumption data significantly less useful because you can no longer see your actual energy usage for the type of driving you’re doing. This could be fixed with a reset button or by adding the ability to select your distance - similar to before.
Projected Range and Average Wh/mi
Unfortunately, the Instant Range button has been removed, and the graph is now locked to what was previously the Average Range. Essentially, you cannot view your real-time range based on current instantaneous consumption - but you can still view the overall projected range.
Additionally, average Wh/mi and projected range are still displayed - but in different areas compared to before. The projected range is displayed on the center-left side of the graph, while the average Wh/mi is now displayed at the top of the screen.
Compare Vs EPA
Another new feature is that your average consumption is now compared to the EPA estimated consumption in Wh/mi. You’ll be able to see at a glance whether your driving style and conditions put you over or under the EPA estimate, which is helpful.
This new comparison is located just under your average Wh/mi.
Small adjustments to your driving style - like not taking off like an electric lightning bolt at every red light - will make a big difference to your range. Don’t worry - we know it’s hard, we love doing it too! Other factors, such as driving downhill versus uphill, will have an impact that you can’t necessarily avoid - unless you’re old enough to have walked to school uphill both ways.
Total Vehicle Consumption
The final new feature is a total vehicle consumption number at the bottom left, under the chart. It will tell you how much energy you’ve consumed over the distance you’ve driven so far. This is a convenient way of seeing exactly how much energy you’ve used.
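This figure is just average consumption multiplied by distance driven. The values below are illustrative, not from any specific vehicle:

```python
# The total-consumption figure is average consumption times distance.
# Values are illustrative examples.
avg_wh_per_mi = 253.5   # average over the displayed window
miles_driven = 200      # the fixed 200-mile window
total_kwh = avg_wh_per_mi * miles_driven / 1000
print(f"{total_kwh:.1f} kWh used over {miles_driven} miles")  # 50.7 kWh used over 200 miles
```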
Bug
There’s currently a bug in the way the Y-axis is labeled. The yellow area of the graph means the vehicle is using energy, while green means the vehicle is generating energy through regenerative braking. However, the Y-axis is currently mislabeled and shows generated energy as using about 100-200 Wh/mi.
The confusion appears to stem from the dark gray line, which looks like the zero point on the Y-axis but actually represents the vehicle’s rated consumption. We would expect this line to sit at zero, since the graph is yellow above it and green underneath it. Instead, it sits at about the 240 Wh/mi mark, though its exact position varies by vehicle.
Due to this bug, it’s currently not possible to see how much energy is being generated.
Dynamic Y-Axis
The Y-axis in the Consumption tab is now dynamic - it will expand and contract automatically based on the driving data. We’ve seen it go from 400 Wh/mi all the way up to 800 Wh/mi. You likely need to be in a Model S Plaid or Cyberbeast with Launch Mode to see numbers much higher than that.
We’re sad to see the X-axis locked to 200 miles, but being able to see total vehicle consumption and compare average consumption against the rated figure is equally, if not more, valuable.
Overall, the new Consumption tab is simpler and doesn’t require user input. While it takes away some features, it is easier for drivers who may not use it regularly. The most important piece is the projected range, which is now easier to see and understand - unless you’re towing and would rather have the historical usage erased because it’s no longer relevant to your current drive. Hopefully, Tesla will allow you to scrub the graph horizontally in the future and give the user the ability to adjust the X-axis dynamically.
If you hop into your Tesla and say ‘Hi’ or ‘Hello’ after pressing the Voice Command button, there is a good chance it’ll reply with “Hello!” This is the newest and most interesting piece of news pointing us to the conclusion that a Tesla voice assistant is on the way.
Previously, if you tried this, it would simply return “Command not understood.” This is the first time the vehicle is responding and interacting with the user.
Experience It Yourself
You’ll need to have your vehicle’s language set to English. Once that’s done, use the voice command button on your steering wheel or yoke - on the Model 3 and Model Y, press the right scroll wheel button; on the Model S, Model X, and Cybertruck, press the voice command button. Then go ahead and say Hi or Hello.
The Hello! response may even have regional differences. For a German Tesla owner, after setting his language to English, the response came back as “Hallo.” We’re interested to see what the responses may be in other regions, so let us know if you notice anything interesting.
We’ve tried a few other basic things, but it seems that, for now, the vehicle only replies to a simple greeting. Asking it what time it is or the $TSLA stock price doesn’t seem to do much yet - unless you’re in China with the updated Smart Assistant.
Server-Side Update
This update appears to be happening over Tesla’s voice system backend and doesn’t require the Holiday Update. Users who aren’t on the Holiday Update are reporting that they’re getting this new response as well.
We already know that Tesla interprets speech remotely; the driver’s voice is not processed in the vehicle. Instead, the voice snippet is transmitted to Tesla’s servers, which process it and send a response back for the vehicle to act on. This is unlikely to change with a smart assistant, as Elon Musk has already said that Grok will still process data server-side instead of on-device.
Many users recently also noticed significant improvements to voice commands, saying that the system understands them better and that responses now come back faster.
All of these things point to a new backend system for voice processing that Tesla is testing. It’s not unusual for a company to switch to a new backend process but keep the capabilities the same as the legacy system until it’s ready to roll out the new features. At that point, it’s simply a flip of a switch to allow the new capabilities.
The new smart assistant that rolled out in China is mostly a backend change, with the in-vehicle experience largely remaining the same: the activation method (a button press) and user interface are unchanged. What changed is the response that comes back from the server - and the assistant gained a voice. The voice that ships with a smart assistant could very well be the new voice users are already hearing in the navigation system of newer vehicles.
Below is a video of the voice assistant in China:
Vehicle Support - Intel?
When China received the Smart Assistant, it was locked to cars equipped with AMD Ryzen processors only. Shortly after its initial launch, it became available to older cars with Intel Atom processors as well.
However, we’re not sure whether it will apply to legacy Model S and Model X owners. One legacy vehicle owner had their vehicle report “Command not understood” when they tried the ’Hi’ voice command.
Grok for Tesla
Elon has previously mentioned that Tesla vehicles will receive Grok AI. Grok still doesn’t have live speech support like other LLMs such as OpenAI’s ChatGPT or Google’s Gemini. However, a major update to Grok recently brought massively improved image generation via a new model called Aurora.
xAI has been hard at work improving Grok, and we’re sure that live speech support is on its way soon. Once that feature arrives on X, Tesla will likely be well-positioned to enable a Grok-powered smart assistant fleet-wide with a flick of a switch.