Breaking Down Tesla’s Autopilot vs. Wall “Wile E. Coyote” Video

By Not a Tesla App Staff
Mark Rober

Mark Rober, of glitter bomb package fame, recently released a video titled Can You Fool A Self-Driving Car? Of course, the vehicle featured in the video was none other than a Tesla - but there’s a lot wrong with this video that we’d like to discuss.

We did some digging and let the last couple of days play out before making our case. Mark Rober’s Wile E. Coyote video is fatally flawed.

The Premise

Mark Rober set out to determine whether it was possible to fool a self-driving vehicle, using various test scenarios. These included a wall painted to look like a road, low-lying fog, mannequins, hurricane-force rain, and bright beams.

Each of these individual “tests” had its own issues - not only because Mark didn’t adhere to any consistent testing methodology, but because he was looking for a particular result - and edited his tests until he got it.

Interestingly, many folks on X were quick to spot that Mark - who has previously been sponsored by Google to use a Pixel phone - recorded inside the vehicle with an iPhone, then edited the footage to make it look like a Pixel for some reason. This, alongside other sloppy edits and cuts, led many, including us, to believe that Mark’s testing was edited and flawed.

Flaw 1: Autopilot, Not FSD

Let’s take a look at the first flaw. Mark tested Autopilot - not FSD. Autopilot is a driving aid for lane centering and speed control - it is not autonomous in any way. It cannot take evasive maneuvers outside its lane, but it can use Tesla’s full suite of active safety features, including Automatic Emergency Braking, Forward Collision Warning, Blind Spot Collision Warning, and Lane Departure Avoidance.

FSD, on the other hand, is both permitted and able to depart its lane to avoid a collision. That means that even if Autopilot tried to stop and couldn’t, it would still hit whatever obstacle was in front of it - whereas FSD could steer around it.

As we continue with the FSD argument, remember that Autopilot runs on a roughly five-year-old software stack - Tesla hasn’t updated it in quite some time. It seems likely that Tesla will eventually replace Autopilot with a trimmed-down version of FSD, but that hasn’t happened yet.

Mark later admitted that he used Autopilot rather than FSD because “You cannot engage FSD without putting in a destination,” which is also incorrect. It is possible to engage FSD without a destination - FSD simply chooses its own route. Where it goes isn’t within your control until you select a destination, but it tends to keep navigating in a generally forward direction.

The whole situation - from not having FSD on the vehicle to not knowing you can activate FSD without a destination - suggests Mark is rather unfamiliar with FSD and has had limited exposure to the feature.

Let’s keep in mind that FSD costs $99 for a single month, so there’s no excuse for him not using it in this video.

Flaw 2: Cancelling AP and Pushing Pedals

Many people on X also followed up with reports that Mark was pressing the pedals or pulling on the steering wheel. When you tap the brake pedal or pull or jerk the steering wheel too hard, Autopilot disengages. For some reason, during each of his “tests,” Mark kept a tight grip on the vehicle’s steering wheel.

This comes off as rather odd - at the extremely short distances at which he was enabling Autopilot, there wouldn’t have been enough time for a wheel nag or takeover warning to be triggered. In addition, he can visibly be seen pulling on the steering wheel before “impact” in multiple tests.

Over on X, techAU breaks it down excellently on a per-test basis. Mark did not engage Autopilot in several tests, and he potentially used the accelerator pedal during the first test - which overrides Automatic Emergency Braking. In another test, Mark admitted to using the pedals.

Flaw 3: Luminar Sponsored

This video was potentially sponsored by LiDAR manufacturer Luminar, although Mark says that isn’t the case. Interestingly, Luminar makes LiDAR rigs for Tesla, which uses them to test ground-truth accuracy for FSD. Just as interesting, Luminar’s earnings call was coming up at the time the video was posted.

Luminar had linked the video at the top of its homepage but has since taken it down. While Mark did not admit to being sponsored by Luminar, there appear to be other conflicts of interest: Mark’s charity foundation has received donations from Luminar’s CEO.

Given how favorable the results were for Luminar, the video appears to have been designed and timed to take advantage of the current wave of negativity against Tesla, while also driving up Luminar’s stock.

Flaw 4: Vision-based Depth Estimation

The next flaw to address is the fact that both humans and machines can judge depth using vision alone. On X, user Abdou ran the “invisible wall” through a monocular depth estimation model (DepthAnythingV2) - one that works from a single image taken at a single angle. Even this fairly simple model can estimate the distance and depth of items in an image - and it easily differentiated the fake wall from its surroundings.
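For readers who want to try a similar single-image test themselves, here is a minimal sketch assuming the Hugging Face transformers library and the small Depth Anything V2 checkpoint on the Hub (the image file name is our own placeholder, and this is not necessarily Abdou’s exact setup):

```python
# Minimal monocular depth test - a sketch, not Abdou's exact pipeline.
from transformers import pipeline
from PIL import Image

# Small Depth Anything V2 checkpoint hosted on the Hugging Face Hub.
depth = pipeline(task="depth-estimation",
                 model="depth-anything/Depth-Anything-V2-Small-hf")

image = Image.open("painted_wall.jpg")  # hypothetical photo of the fake wall
result = depth(image)

# result["depth"] is a PIL image of per-pixel relative depth. A painted wall
# renders as one uniform plane instead of a road receding into the distance.
result["depth"].save("painted_wall_depth.png")
```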

Tesla’s FSD uses a far more advanced multi-angle, multi-image system that stitches together footage from its cameras to build a 3D model of the surrounding environment, then analyzes the result for decision-making and prediction. Tesla’s more refined and complex model would be far better equipped to detect such an obstacle - and these innovations are far more recent than the five-year-old Autopilot stack.

While judging distance from a single image is harder, once you have multiple images - as in a video feed - you can distinguish objects and estimate distances by tracking how each part of the scene grows as you approach it. Essentially, if every pixel of an object is growing at the same rate, the object is a flat plane - like a wall.
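As a toy illustration of that cue - our own sketch, not Tesla’s pipeline - consider feature points tracked across two frames. On a flat, camera-facing wall, every point expands away from the points’ common center by the same factor, whereas points in a real road scene expand at rates that vary with their distance:

```python
# Toy illustration of the "uniform pixel growth" cue described above.
import numpy as np

def expansion_ratios(pts_t0: np.ndarray, pts_t1: np.ndarray) -> np.ndarray:
    """Per-point radial expansion between two frames of N x 2 pixel coords."""
    c0, c1 = pts_t0.mean(axis=0), pts_t1.mean(axis=0)
    r0 = np.linalg.norm(pts_t0 - c0, axis=1)
    r1 = np.linalg.norm(pts_t1 - c1, axis=1)
    return r1 / r0

# Hypothetical corners of a painted wall, tracked one frame apart while the
# camera closes in: everything scales by the same factor about the center.
wall_t0 = np.array([[100, 100], [300, 100], [100, 250], [300, 250]], float)
wall_t1 = (wall_t0 - wall_t0.mean(axis=0)) * 1.25 + wall_t0.mean(axis=0)

ratios = expansion_ratios(wall_t0, wall_t1)
# Near-zero spread in the ratios means the whole object looms at one
# rate - the signature of a flat, camera-facing plane.
print(ratios, "flat wall?", ratios.std() < 0.01)
```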

Case in Point: Chinese FSD Testers

To make the case stronger, some Chinese FSD testers took to the streets and put up a semi-transparent sheet - which the vehicle refused to drive through or even near. It immediately attempted to maneuver away each time the test was run, and it refused to advance while a pedestrian stood in the road.

Thanks to Douyin and Aaron Li for putting this together, as it makes an excellent basic example of how FSD would handle such a situation in real life.

Flaw 5: The Follow-Up Video and Interview

Following the community backlash, Mark released a video on X, hoping to resolve the community’s concerns. However, this also backfired. It turned out Mark’s second video showed an entirely different take than the one in the original video - recorded at a different speed, from a different angle, and with Autopilot engaged at a different point.

Mark then followed up with an interview with Philip DeFranco, where he said that there were multiple takes and that he used Autopilot because he didn’t know that FSD could be engaged without a destination. He also stated that Luminar did not pay him for the video - despite the company’s prominent billing as the “leader in LiDAR technology” throughout it.

Putting It All Together

Overall, Mark’s video was rather duplicitous - he recorded multiple takes to get the result he needed, prevented Tesla’s software from functioning properly by intervening, and used an outdated feature set rather than the FSD his video’s title implies.

Upcoming Videos

Several other video creators are already working to replicate what Mark “tried” to test in this video.

To get a complete picture, we need to see unedited takes, even if they’re included at the end of the video. The full vehicle specifications should also be disclosed. Additionally, the test should be conducted using Tesla’s latest hardware and software—specifically, an HW4 vehicle running FSD v13.2.8.

In Mark’s video, Autopilot was engaged just seconds before impact. However, for a proper evaluation, FSD should be activated much earlier, allowing it time to react and, if capable, stop before hitting the wall.

A wave of new videos is likely on the way—stay tuned, and we’ll be sure to cover the best ones.

Tesla May Add Lumbar Support to Driver Profiles, Offer Turn Signal Stalk Retrofit

By Karan Singh
Not a Tesla App

Tesla’s Vice President of Vehicle Engineering, Lars Moravy, recently took to X and opened the floor for user input. There, he asked the community for features and improvements they’d like to see to make Teslas better heading into 2026.

This post generated thousands of suggestions - and we recapped the best of them. There were also a few that Lars responded to, giving owners hope for some much-requested future changes, so let’s take a look at what may be coming.

Lumbar Profile Support

Today, lumbar support is one of the few settings that is not saved to a Tesla driver profile. That means if multiple drivers use the same vehicle, you’re often left adjusting it manually, as the seat retains whatever setting was last used. One community member suggested saving the lumbar setting to the driver profile, just as Tesla does for other seat settings.

Lars said making this change seems doable, but it’ll take some engineering magic. Unlike the other seat adjustments, lumbar support isn’t tied to an absolute position sensor, so Tesla has no exact value to save - but Lars believes Tesla can find a way. One option would be to time how long the lumbar motor runs to reach the user’s preferred setting and save that value.
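To make that idea concrete, here is a hedged sketch of how such motor-timing (dead-reckoning) logic could work - our own illustration with a made-up travel time, not Tesla’s implementation:

```python
# Sketch of dead-reckoning a lumbar position without an absolute sensor:
# home to a hard stop once, then integrate motor run time in each direction.

FULL_TRAVEL_SECONDS = 4.0  # hypothetical time from fully retracted to fully extended

class LumbarEstimator:
    """Tracks lumbar position as a fraction of full travel (0.0 to 1.0)."""

    def __init__(self) -> None:
        self.position = 0.0  # calibrated by homing to the retracted hard stop

    def run_motor(self, direction: int, seconds: float) -> None:
        """direction: +1 extends, -1 retracts; integrates run time into position."""
        self.position += direction * (seconds / FULL_TRAVEL_SECONDS)
        self.position = min(max(self.position, 0.0), 1.0)  # clamp at hard stops

    def restore(self, saved: float) -> None:
        """Drive the motor back to a previously saved profile value."""
        delta = saved - self.position
        self.run_motor(1 if delta > 0 else -1, abs(delta) * FULL_TRAVEL_SECONDS)

# Usage: one driver adjusts the seat and the value is saved to their profile;
# a second driver changes it; the profile later restores the original setting.
seat = LumbarEstimator()
seat.run_motor(+1, 1.5)          # first driver extends lumbar for 1.5 s
profile_value = seat.position    # value a driver profile could store
seat.run_motor(-1, 4.0)          # second driver retracts it fully
seat.restore(profile_value)      # recall the first driver's setting
print(round(seat.position, 3))   # back to ~0.375 of full travel
```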

With that said, it seems the vehicle engineering team may take a look at this one, and we may see it included in a future update.

Model 3 Signal Stalk Retrofit

The move away from traditional stalks in favor of steering wheel buttons on the Refreshed Model 3 has been a point of debate. While the author is squarely in the camp of steering wheel buttons (at least on the Cybertruck), many disagree, saying the buttons on the Refreshed 3 aren’t as satisfying or easy to use.

Many other drivers also prefer the tactile feel and muscle memory of a physical stalk for signaling. Tesla appears to recognize this, as it retained the turn signal stalk on the new Model Y. There are also rumors that Tesla will reintroduce the turn signal stalk on the Model 3.

If Tesla adds stalks back to the new Model 3, current 2024+ Model 3 owners would still be left without them. However, a user suggested offering stalks as a retrofit, and Lars said he would look into a retrofitted signal stalk for the Refreshed Model 3, similar to the simplified version in the Refreshed Model Y.

While less definitive than the lumbar support response, it appears that Tesla may at least consider offering a stalk retrofit for the new Model 3. If you’re a lover of signal stalks and can’t wait for Tesla to get an official one - we recommend the Enhauto S3XY Stalks, which are customizable and feel very close to Tesla’s original fit and finish.

With that said, it’s nice to see Tesla incorporating more community feedback into its vehicle design these days. Perhaps one day, they’ll address the infamous auto wipers - they have gotten better, but they’re still not as reliable as what’s available in most other vehicles. In the meantime, we look forward to the changes that emerge from these recent conversations.

Tesla to Issue TCU Fix That Prevents Vehicles From Sleeping in Update 2025.14.6

By Karan Singh
Not a Tesla App

Sometimes, even with Tesla’s intensive bug-testing regime, bugs manage to make it out into the wild. In this particular case, a European user (@darkwaffle48484 on X) noticed that their 2024 Model 3 was using more battery than normal while parked. Normally, the vehicle would lose about 1-2% per week; recently, however, they saw much larger drops of 3-4% per night.
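To put those figures on a common scale, a quick back-of-the-envelope calculation using the reported numbers shows how large the jump is:

```python
# Convert the reported drain figures to a common per-day rate.
normal = (1 / 7, 2 / 7)  # 1-2% per week, in % per day
buggy = (3.0, 4.0)       # 3-4% per night, in % per day

print(f"normal: {normal[0]:.2f}-{normal[1]:.2f} %/day")
print(f"buggy:  {buggy[0]:.1f}-{buggy[1]:.1f} %/day")
# Comparing the extremes: roughly a 10x to 28x increase in idle drain.
print(f"increase: {buggy[0] / normal[1]:.0f}x-{buggy[1] / normal[0]:.0f}x")
```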

They monitored their Tesla widget and noticed the car wasn’t entering deep sleep. The widget consistently showed a recent connection time—usually within the past 45 minutes.

Fix Inbound

After discussing the issue with other Tesla owners and realizing it was somewhat widespread, they contacted Tesla Service. The service team confirmed that it was a firmware bug affecting the Telematics Control Unit (TCU), which prevented the vehicle from entering deep sleep mode.

The TCU is essentially the communications hub of your Tesla - and is mounted on the ceiling of newer vehicles such as the Model 3 and the new Model Y. It enables cellular and location services (via GPS) and also handles Wi-Fi and Bluetooth services. Tesla Service stated that this bug is planned to be fixed in update 2025.14.6, although the exact version number could change. However, they confirmed that they are aware of the issue and it is being addressed.

@darkwaffle48484

When the user reached out, Tesla Service responded with the following (translated from Dutch):

“It has been confirmed that this is a firmware bug. The fix is in one of the next updates. Currently, it is planned for 2025.14.6 (subject to change). Do you have any more questions?”

Potentially Region-Specific

This bug could potentially be region-specific. TCUs often require specific hardware components, such as modems, as well as specific firmware versions that support different regions and cellular providers. These enable Tesla to comply with local cellular standards and regulations and ensure that your vehicle can connect to the networks available in that particular region.

At this point, it’s not clear when the fix will roll out, but given that update 2025.14.1 has practically stopped rolling out, Tesla may be waiting to resume the rollout with update 2025.14.3 or this 2025.14.6 version.

If you’ve noticed this issue and are in a non-European nation, let us know.
