Will Tesla add a bird's eye view feature to newer vehicles?
@zzapcars
A leak of Tesla’s forthcoming hardware 4.0 computer reveals that Tesla will be able to connect up to 12 cameras with its new FSD hardware suite, up from the nine it uses today.
Earlier this week, Tesla owner and hacker Greentheonly leaked information and pictures of Tesla’s anticipated hardware 4.0. Green confirmed several improvements to the hardware, some major and some subtle.
12 Cameras in Hardware 4
The new hardware's main board will accept up to 12 cameras, with one connector labeled as a spare, leaving 11 usable. Tesla currently uses nine cameras and is expected to reduce the front-facing cameras by one, carrying over eight. Based on that math and the labels on the camera connectors, it appears that Tesla will include three additional cameras with FSD hardware 4.0.
When Greentheonly revealed these hardware 4 details, Twitter user StayLameBro1 asked where the new cameras would be placed around the vehicle. “There are seemingly 3 bumper cameras,” Green responded. “One up front the other two are left and right so in the corners somewhere.”
Green adds that the combination of bumper cameras and HD radar will remove blind spots, which is a big deal on its own. Lastly, Green noted that these additional cameras could allow Tesla to incorporate a real bird’s eye view into its vehicles, a feature many Tesla owners have been requesting.
Image: There's currently a large blind spot in front of the vehicle (credit: Munro)
Bird’s Eye View
Bird's eye view is a feature that many other automakers have offered for a while now, including Toyota, Honda, BMW, and Lucid. It lets drivers see a top-down view of their vehicle and its surroundings, typically stitched together from several wide-angle cameras around the car, improving safety and spatial awareness.
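For readers curious how such a view is usually built, here is a minimal sketch of the common approach: each surround camera's frame is warped onto the ground plane with a pre-calibrated homography, and the warped views are blended into one top-down canvas. The function names, canvas size, and blending scheme below are illustrative assumptions, not Tesla's implementation.

```python
# Minimal sketch of composing a surround ("bird's eye") view from multiple
# cameras. Homographies would come from a per-vehicle calibration step.
import cv2
import numpy as np

CANVAS_SIZE = (800, 800)  # top-down canvas (width, height) in pixels; scale is assumed

def warp_to_ground_plane(frame: np.ndarray, homography: np.ndarray) -> np.ndarray:
    """Project one camera frame onto the ground plane using a pre-calibrated
    3x3 homography that maps camera pixels to top-down canvas pixels."""
    return cv2.warpPerspective(frame, homography, CANVAS_SIZE)

def compose_birds_eye(frames: dict, homographies: dict) -> np.ndarray:
    """Blend the warped view from every camera into one top-down image.
    Overlapping regions are simply averaged here; production systems use
    per-pixel masks and seam blending instead."""
    canvas = np.zeros((CANVAS_SIZE[1], CANVAS_SIZE[0], 3), dtype=np.float32)
    weight = np.zeros((CANVAS_SIZE[1], CANVAS_SIZE[0], 1), dtype=np.float32)
    for name, frame in frames.items():
        warped = warp_to_ground_plane(frame, homographies[name]).astype(np.float32)
        mask = (warped.sum(axis=2, keepdims=True) > 0).astype(np.float32)
        canvas += warped * mask
        weight += mask
    return (canvas / np.clip(weight, 1.0, None)).astype(np.uint8)
```

The key requirement is camera coverage with enough overlap around the entire car, which is why extra bumper and corner cameras would matter for a true top-down view.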
Last fall, Tesla announced it was transitioning its fleet to its own Tesla Vision. The company added that vehicles built in October 2022 and beyond would no longer include ultrasonic sensors (USS). This caused confusion in the Tesla community given the resulting blind spots, but hardware 4 will seemingly address these concerns.
The removal of the USS saves Tesla approximately $114 per vehicle. This is in line with “Project Highland,” a cost-cutting project for the upcoming revamped Model 3.
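To get a rough sense of why a per-vehicle saving like that matters, here is a back-of-the-envelope calculation; the per-vehicle figure comes from above, while the annual volume is purely an assumption for illustration.

```python
# Back-of-the-envelope scale of the USS removal savings.
savings_per_vehicle = 114          # USD per vehicle, figure from the article
assumed_annual_volume = 1_800_000  # vehicles per year (an assumption, not a Tesla figure)
annual_savings = savings_per_vehicle * assumed_annual_volume
print(f"~${annual_savings / 1e6:.0f}M saved per year")  # -> ~$205M saved per year
```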
Tesla CEO Elon Musk told investors that upgrading hardware 3 to hardware 4 is difficult and expensive.
“The cost and difficulty of retrofitting hardware 3 with hardware 4 is quite significant,” Musk said. “So, it would not be, I think, economically feasible to do so.”
After the hardware 4 leak, Musk's statement starts to make sense. We're not only talking about a new computer, but also higher-resolution cameras, additional cameras, possibly a new bumper design, and an HD radar unit.
Musk has also reiterated that hardware 3 will be capable of driving more safely than a human. However, it's not clear what constraints that claim would come with, such as specific roads and weather conditions, or whether a human driver would always be required to stay attentive.
Update: We have now confirmed that FSD Hardware 4.0 will not include additional cameras in the bumper or elsewhere, which means the vehicle will still have a front blind spot. We have been told that the 'bumper' camera connectors on HW4.0 are reserved for future expansion and will not be used initially.
Tesla recently showed off a demo of Optimus, its humanoid robot, walking over moderately challenging terrain: not a flat surface, but dirt and slopes. Terrain like this can be difficult for a humanoid robot, especially during the training cycle.
Most interestingly, Milan Kovac, VP of Engineering for Optimus, clarified what it takes to get Optimus to this stage. Let’s break down what he said.
Optimus is Blind
Optimus is getting seriously good at walking now: it can keep its balance over uneven ground, even while walking blind. Tesla is currently relying on the robot's onboard sensors alone, with no camera input, all fed into a neural net running on the embedded computer.
Essentially, Tesla is building Optimus from the ground up, relying on as much additional data as possible while it trains vision. This is similar to how Tesla trains FSD on vehicles, using LiDAR rigs to validate the vision system’s accuracy. While Optimus doesn’t have LiDAR, it relies on all of its other onboard sensors, many of which will likely be simplified away as vision takes over as the primary sensor.
Today, Optimus is walking blind, but it’s able to react almost instantly to changes in the terrain underneath it, even if it falls or slips.
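To make the "walking blind" idea concrete, here is a minimal sketch of what a proprioception-only control loop can look like: the policy network sees only IMU and joint feedback, never camera frames, and has to infer slips and slopes from how the body responds. The network size, sensor layout, joint count, and control rate are all assumptions for illustration, not details of Tesla's actual system.

```python
import torch
import torch.nn as nn

NUM_JOINTS = 28                # hypothetical actuator count
OBS_DIM = 6 + 3 * NUM_JOINTS   # IMU (angular velocity + gravity direction)
                               # plus position, velocity, and last command per joint

class BlindLocomotionPolicy(nn.Module):
    """Small MLP mapping proprioception to target joint positions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(OBS_DIM, 256), nn.ELU(),
            nn.Linear(256, 128), nn.ELU(),
            nn.Linear(128, NUM_JOINTS),
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.net(obs)

def control_step(policy, imu, joint_pos, joint_vel, last_cmd):
    """One control tick (e.g. every 20 ms): build the observation from
    proprioception only and output the next joint targets. The terrain is
    never observed directly; slips and slopes must be inferred from how
    the body reacts."""
    obs = torch.cat([imu, joint_pos, joint_vel, last_cmd]).unsqueeze(0)
    with torch.no_grad():
        return policy(obs).squeeze(0)
```

Adding cameras later would simply extend the observation with vision features, letting the robot anticipate terrain instead of only reacting to it.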
What’s Next?
Next up, Tesla AI will be adding vision to Optimus - helping complete the neural net. Remember, Optimus runs on the same overall AI stack as FSD - in fact, Optimus uses an FSD computer and an offshoot of the FSD stack for vision-based tasks.
Milan mentions they’re planning on adding vision to help the robot plan ahead and improve its walking gait. While the zombie shuffle is iconic and a little bit amusing, getting humanoid robots to walk like humans is actually difficult.
There’s plenty more, too, including better responsiveness to velocity and direction commands and learning to fall and stand back up. Falling in a way that protects yourself and minimizes damage comes naturally to humans, but not to a robot. Training it to do so is essential to keeping the robot, its surroundings, and the people it interacts with safe.
We’re excited to see what’s coming next for Optimus, especially since it is already being put to work, in some fashion, in Tesla’s factories.
In a relatively surprising move, GM announced that it is realigning its autonomy strategy and prioritizing advanced driver assistance systems (ADAS) over fully autonomous vehicles.
GM is effectively closing Cruise (its autonomous driving division) and focusing on its Super Cruise (ADAS) feature. The engineering teams at Cruise will join the GM teams working on Super Cruise, shuttering the fully autonomous vehicle business.
End of Cruise
GM cites “an increasingly competitive robotaxi market” and the “considerable time and resources” required to scale the business to profitability. Essentially, they’re unable to keep up with competitors at their current funding and research levels, falling further and further behind.
Cruise has been offering driverless rides in several cities, pairing HD maps with vehicles equipped with a dazzling array of over 40 sensors. That means each Cruise vehicle is a massive investment that doesn't turn a profit while it collects data to work towards autonomy.
Cruise has clearly been on the back burner for a while; a quick glance at their website, which is still up for now, shows that the last time they officially released any major news was back in 2019.
Competition is Killer
Their most direct competitor, Waymo, is funded by Google, which has a vested interest in making sure it has a play in the AI and autonomy space.
Interestingly, this news comes just a month after Tesla’s We, Robot event, where they showed off the Cybercab and the Robotaxi network, as well as plans to begin deployment of the network and Unsupervised FSD sometime in 2025. Tesla is already in talks with some cities in California and Texas to launch Robotaxi in 2025.
GM Admits Tesla Has the Right Strategy
On the business call following the announcement, GM admitted that Tesla’s end-to-end, vision-based approach to autonomy is the right strategy. While they say Cruise had started down that path, they’re putting aside fully autonomous vehicles for now and focusing on bringing that technology to Super Cruise instead.
NEWS: GM just admitted that @Tesla’s end-to-end approach to autonomy is the right strategy.
“That’s where the industry is pivoting. Cruise had already started making headway down that path. We are moving to a foundation model and end-to-end approach going forward.” pic.twitter.com/ACs5SFKUc3
With GM now prioritizing Super Cruise, it will put autonomy aside and concentrate solely on ADAS features that relieve driver stress and improve safety. While those are positive goals that will benefit all road users, full autonomy is the real key to removing the massive impact that vehicle accidents have on society today.
In addition, Super Cruise is extremely limited: it cannot brake for traffic controls and doesn't work in adverse conditions, even rain. It can only function when lane markings are clear, there are no construction zones, and the car has a working data connection.
The final piece of the picture is that the vehicle has to be on an HD-mapped, compatible highway, essentially locking Super Cruise to wherever GM has spent time mapping, rather than working anywhere in a general sense, like FSD or Autopilot.
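Taken together, those limitations amount to an operational-design-domain gate: the feature engages only when every precondition holds. Below is a minimal sketch of that kind of check; the field names and threshold are illustrative assumptions, not GM's actual logic.

```python
from dataclasses import dataclass

@dataclass
class DriveContext:
    on_mapped_highway: bool       # vehicle is on an HD-mapped, compatible highway
    lane_marking_quality: float   # 0.0 (invisible) to 1.0 (crisp)
    in_construction_zone: bool
    raining: bool
    has_connectivity: bool        # live data connection available

def ads_available(ctx: DriveContext) -> bool:
    """Return True only when every ODD precondition is satisfied."""
    return (
        ctx.on_mapped_highway
        and ctx.lane_marking_quality >= 0.7   # assumed threshold
        and not ctx.in_construction_zone
        and not ctx.raining
        and ctx.has_connectivity
    )
```

A general-purpose system like FSD or Autopilot removes the map and geofence conditions from that gate, which is why it can attempt to operate almost anywhere.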
Others Impressed - Licensing FSD
Interestingly, some other manufacturers have also weighed in on the demise of Cruise. BMW, in a now-deleted post, said that a demo of Tesla’s FSD was “very impressive.” There’s a distinct chance that BMW and other manufacturers are watching to see what Tesla does next.
Image: BMW chimes in on a now-deleted post. The Internet is forever, BMW! (Not a Tesla App)
It seems that FSD caught their eye after We, Robot, with the online demonstrations of FSD V13.2 being the pivot point. At the 2024 Shareholder Meeting earlier in the year, Elon shared that several manufacturers had reached out, looking to understand what it would take to license FSD from Tesla.
There is a good chance 2025 will be the year legacy manufacturers announce they're adopting FSD, much like the surprise announcements around the adoption of the NACS charging standard.