Tesla's Autopilot and AI team must be burning the midnight oil again. Elon Musk took to Twitter to set expectations for his team on some significant projects, tweeting: "Note, Autopilot/AI team is also working on Optimus and (actually smart) summon/autopark, which have end of month deadlines."
The end of the month is not too far away, so time is ticking down to complete these extraordinarily complex and revolutionary projects. Optimus, the highly anticipated humanoid robot, will be revealed at Tesla's AI Day on September 30.
If that's not enough, the team has to improve two aspects of the Autopilot system. Clearly, the CEO is not impressed with the current program. It is called Smart Summon, but Musk evidently doesn't think the name is accurate, since he wrote "(actually smart)" in his tweet. In addition, users have pointed out that their Teslas can struggle in parking lots. But, to be fair, who hasn't struggled in a parking lot?
When using Smart Summon, Tesla uses your phone's GPS location as the target destination. Users can also drop a pin at a location, and the vehicle is supposed to navigate to that point by steering around obstacles. Smart Summon is designed to work when the user is within 20 feet of the car. However, the manual states, "You must maintain a clear line of sight between you and Model (S, 3, X, Y) and closely monitor the vehicle and its surroundings at all times."
In addition, the same webpage has six warnings and one caution for users. These include details such as: the cameras must be clean, the feature must be used on paved surfaces, it may not stop for all objects, and it currently requires adequate cell service and GPS data.
Autopark also has several warnings on its user manual webpage. This program assists the driver in finding parking, both perpendicular and parallel. However, unlike Smart Summon, the user must be in the car before letting the Tesla take over.
Navigate without Maps
Musk previously hinted at a major improvement that's coming: the ability for the FSD Beta to navigate roads with no map data. The vehicle will be able to navigate to a specific GPS point or pinned location (e.g., rural roads), says Musk.
Dead Reckoning Navigation (Navigating without GPS)
Elon has also said that Tesla is working on the AI's ability to perform 'dead reckoning' navigation (navigating using only "inertial measurements, wheel movement & vision").
Elon gave underground parking garages as an example of where FSD would need the ability to navigate without GPS, map data or cell service. The car will be able to do this by starting from its last known GPS location and then estimating its position using only a compass, wheel movement and speed.
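The core idea of dead reckoning is simple to illustrate: starting from a known position, keep integrating heading and wheel-derived speed over time. The sketch below is a heavily simplified illustration of that idea (hypothetical function and names, not Tesla's implementation, which would also fuse inertial and vision data):

```python
import math

def dead_reckon(start_x, start_y, steps):
    """Estimate position from a known starting point using only heading
    and wheel-derived speed (no GPS). Each step is a tuple of
    (heading_radians, speed_m_per_s, dt_seconds)."""
    x, y = start_x, start_y
    for heading, speed, dt in steps:
        distance = speed * dt          # wheel odometry: distance covered this step
        x += distance * math.cos(heading)
        y += distance * math.sin(heading)
    return x, y

# Car enters a garage at (0, 0), its last known GPS fix:
# drives east at 10 m/s for 2 s, then north at 5 m/s for 4 s.
steps = [(0.0, 10.0, 2.0), (math.pi / 2, 5.0, 4.0)]
print(dead_reckon(0.0, 0.0, steps))  # roughly (20.0, 20.0)
```

In practice, small errors in each step accumulate, which is why this only works over short stretches such as a parking garage before a fresh GPS fix is needed.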
Many Twitter users tried to figure out exactly what Musk was referring to with these programs. Will the car drop off its passengers and park itself (reverse summon is expected to offer three options)? How will it navigate difficult parking lots with unpredictable pedestrians and drivers?
Elon Musk started Tesla's AI Day 2022 by saying, "I want to set some expectations with respect to our Optimus Robot," just before the doors opened behind him. A robot walked out, waved at the audience, and did a little dance. It was admittedly a humble beginning, and he explained, "the Robot can actually do a lot more than what we just showed you. We just didn't want it to fall on its face." Musk shared his vision for the Tesla robot: "Optimus is going to be incredible in five years, ten years mind-blowing." The CEO said other technologies that have changed the world have plateaued; the Robot is just getting started.
Tesla's CEO envisions Optimus eventually being like Commander Data, the android from Star Trek: The Next Generation, except it "would be programmed to be less robot-like and more friendly." Undoubtedly, there is a long way to go to achieve what Doctor Noonien Soong created in Star Trek: TNG. What was demonstrated onstage wasn't at that level, but several videos throughout the presentation highlighted what the Robot is capable of at this very early stage of development. The audience watched the Robot pick up boxes, deliver packages, water plants and work at a station in the Tesla factory in Fremont.
Development over 8 Months
The breakdown of some of the systems of the Tesla Robot
Tesla (Edited by Not a Tesla App)
The first robot to take the stage at AI Day was not Optimus but Bumble C, another nod to the Transformers franchise, in which Bumblebee played a significant role. However, Bumble C is far less advanced than Optimus, which did appear later, though on a cart.
Several Tesla engineers took turns at the microphone describing some of the most complex elements of the project, which was first announced one year ago. Perhaps the best description of the effort was that the company is moving from building a robot on wheels to a robot on legs. However, that may be oversimplifying it; for example, the car has two motors, while the Robot has 28 actuators.
Overall Design and Battery Life
Tesla's brightest demonstrated how the project has come to life over the past eight months. This group of computer masterminds seemingly had to become anatomy experts, as Tesla took hints from the human body to create a humanoid robot. That is an essential factor in creating Optimus: everything people interact with is designed to be usable by a human, with two legs, two arms, ten fingers, etc. If the Robot differed from what the world is already designed for, everything would have to change. However, recreating the human body and its countless movements would take far too long, so Tesla has stripped it down to fewer than 30 core movements, not including the hands.
Like the human torso contains the heart, the Robot's chest holds the battery. Tesla projects that a single charge of the 2.3 kWh pack will provide enough energy for a full day's work. All the battery electronics are integrated into a single printed circuit board within the pack, keeping charge management and power distribution in one place. Tesla applied lessons learned from vehicle and energy production to create the battery, allowing for streamlined manufacturing and simple, effective cooling.
Tesla showed what the Robot sees, and it looked very familiar. That's because the neural networks are pulling directly from Autopilot. Training data had to be collected to show indoor settings and other products not used with the car. Engineers have trained neural networks to identify high-frequency features and key points within the Robot's camera streams, such as a charging station. Tesla has also been using the Autopilot simulator but has integrated it for use with the Robot programming.
Tesla shows off what the Optimus robot sees
Tesla (Edited by Not a Tesla App)
The torso also contains the centralized computer that Tesla says will do everything a human brain does, such as processing vision data, making split-second decisions based on multi-sensory inputs and supporting communications. In addition, the Robot is equipped with wireless connectivity and audio support. Yes, the Robot is going to have conversations: "we really want to have fun, be utilitarian and also be a friend and hang out with you," said Musk.
Motors Mimic Joints
The 28 actuators throughout the Robot's frame are placed where many of the joints in the human body are. Just one of those actuators was shown lifting a half-tonne, nine-foot concert grand piano. Tesla has run thousands of test models to show how each motor works with the others and how to operate the most relevant actuators for a given task effectively. Even the act of walking requires several calculations that the Robot must make in real-time, not only to perform the motion but also to appear natural. The robots will be programmed with locomotion code: the desired path goes to the locomotion planner, which generates reference trajectories that are checked against state estimations, much like the human vestibular system.
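That plan-then-estimate loop can be sketched in a few lines. The functions below are hypothetical and heavily simplified (a planner that breaks a path into waypoints, and an estimator that blends the planned position with a noisy sensor reading); they illustrate the general control pattern described above, not Tesla's actual code:

```python
def plan_trajectory(path, n_steps):
    """Hypothetical locomotion planner: turn a desired path (start, goal)
    into evenly spaced intermediate waypoints."""
    (x0, y0), (x1, y1) = path
    return [(x0 + (x1 - x0) * i / n_steps,
             y0 + (y1 - y0) * i / n_steps) for i in range(1, n_steps + 1)]

def estimate_state(planned, measured, gain=0.8):
    """Hypothetical state estimator: blend the planned position with a
    noisy measurement, loosely analogous to how the vestibular system
    corrects the body's sense of motion."""
    return tuple(gain * m + (1 - gain) * p for p, m in zip(planned, measured))

# Walk from (0, 0) toward (1, 0) in 4 steps, correcting at each step.
state = (0.0, 0.0)
for waypoint in plan_trajectory(((0.0, 0.0), (1.0, 0.0)), 4):
    noisy_measurement = (waypoint[0] + 0.01, waypoint[1])  # pretend sensor
    state = estimate_state(waypoint, noisy_measurement)
print(state)  # x is pulled slightly toward the noisy measurement
```

A real humanoid controller would run a loop like this hundreds of times per second per joint, with far more sophisticated planning and filtering, but the plan → measure → correct cycle is the same.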
Human hands can move at 300 degrees per second and have tens of thousands of tactile sensors. They can manipulate anything in our daily lives, from bulky, heavy items to something delicate, and Tesla is now recreating that with Optimus. Six actuators and 11 degrees of freedom are incorporated into the robot hand. It has an in-hand controller that drives the fingers and receives sensory feedback, and the fingers have metallic tendons that provide both flexibility and strength. The hands are designed to allow a precision grip on small parts and tools.
Responsible Robot Safety
Musk wanted to start AI Day with the epic opening scene from Terminator, in which a robot crushes a skull. He has heard people's fears and warnings, "don't go down the terminator path," but the CEO said safety is a top priority. There are safeguards in place, including designs for a localized control ROM, not connected to the internet, that can turn the Robot off. He sees this as a stop button or remote control.
Musk said the development of Optimus may broaden Tesla's mission statement to include "making the future awesome." He believes most people don't recognize the potential, and it "really boggles the mind." Musk said, "this means a future of abundance. There is no poverty. You can have whatever you want in terms of products and services. It really is a fundamental transformation of civilization as we know it." All of this at a predicted price of less than $20,000 USD.