Apple Music may be available in Tesla’s new holiday update, but don’t expect the lossless audio quality that Apple Music subscribers enjoy… at least not yet.
Tesla owner and software developer Dan Burkland recently tested the audio quality of Tesla’s in-car streaming services: TIDAL, Spotify, and now Apple Music.
Burkland tested TIDAL previously on a different Tesla software version, but with Tesla’s ever-changing software and the rollout of the holiday update, he chose to run the tests again.
Setup and Songs Used to Test
He connected his Model Y to his home's WiFi network and set up a DHCP reservation, which ensured the vehicle always used the same IP address. He then installed ntopng on his OPNsense firewall to monitor traffic statistics for the vehicle. After zeroing out the host stats for the Model Y, he tested a total of nine songs, including “Purple Rain” by Prince, “Foreplay” by Boston, and “Kashmir” by Led Zeppelin.
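For anyone curious about the math behind this kind of measurement, here is a minimal sketch of the per-track bookkeeping: read the vehicle's byte counter (for example, from ntopng's per-host stats) before and after each song, then convert the difference into an average bitrate. The song durations and byte counts below are hypothetical placeholders, not Burkland's actual data.

```python
# Minimal sketch: convert per-track byte counts into an average streaming bitrate.
# Byte totals would come from the firewall's per-host counters for the car's IP;
# the values below are hypothetical placeholders, not Burkland's measurements.

def average_bitrate_kbps(bytes_transferred: int, duration_s: float) -> float:
    """Average streaming bitrate in kilobits per second for one track."""
    return (bytes_transferred * 8) / duration_s / 1000

# (song, bytes observed during playback, track length in seconds) - illustrative only
tracks = [
    ("Purple Rain", 12_000_000, 521),
    ("Kashmir", 9_500_000, 508),
]

for song, nbytes, secs in tracks:
    print(f"{song}: ~{average_bitrate_kbps(nbytes, secs):.0f} kbps")
```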
Results
Burkland’s tests concluded that TIDAL still offers the best listening experience, with an average bitrate of ~1165 kbps. This isn’t too surprising, as TIDAL has always championed the highest-quality audio streaming experience.
Surprisingly, Spotify’s audio quality came in ahead of Apple Music to nab second place. Burkland’s tests showed Spotify streaming at an average bitrate of ~157 kbps, while Apple Music came in at a subpar ~118 kbps.
Burkland added that he believes Apple Music is limiting bitrate for the in-car app, but a future update to Tesla’s software will hopefully resolve this. If Tesla can enable lossless streaming for Apple Music, it’ll give TIDAL a run for its money in high-fidelity streaming via the in-car app.
Check out some of Dan’s test results below, or view his Reddit thread for the complete list.
During their tests, they observed the same pattern across every song: a large burst of data at the start of each track, followed by a slow trickle. The low data rate for Apple Music appears to align with the company’s HE-AAC codec at 64 kbps.
“It appears to buffer most or even all of the song, then pause between tracks to do it again,” writes u/OverlyOptimisticNerd. “On average, I saw ~2MB per track, with ~1.7MB during the initial burst and ~0.3MB throughout the track. This is consistent with the HE-AAC standard, as most of these songs were a little over 3 minutes in length.”
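As a rough sanity check on that claim, using only the figures quoted above (not new measurements): a constant 64 kbps stream over a track of a little more than three minutes works out to roughly 1.6 MB of audio data, which lines up with the ~1.7 MB initial burst plus trickle observed.

```python
# Back-of-the-envelope check of the HE-AAC figure using the numbers quoted above.
bitrate_kbps = 64        # HE-AAC rate cited for Apple Music
track_seconds = 200      # "a little over 3 minutes"
audio_mb = bitrate_kbps * 1000 / 8 * track_seconds / 1_000_000
print(f"~{audio_mb:.1f} MB of audio data")  # ~1.6 MB, close to the ~2 MB observed per track
```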
While Apple Music may come in at the lowest average bitrate of the three services tested, that doesn't necessarily mean it has the lowest quality. Audio quality comes down to a variety of factors, including the bitrate, whether that bitrate is variable, and the efficiency of the audio codec used.
Apple Music's HE-AAC codec is optimized for low-bandwidth applications, meaning it can outperform an AAC-encoded file in lower-bandwidth situations. In the real world, Apple Music in your Tesla should sound very similar to streaming music from Spotify, but not as good as TIDAL's offerings.
Tesla’s FSD has made some truly incredible strides since V11, and since FSD V12.5, the experience has been hands-free for vehicles with a cabin camera.
However, a persistent point of frustration for many users is the strictness of the Driver Monitoring System (DMS), often referred to as the “nag.” In a recent interaction on X, Mike P detailed his grievances about how strict the DMS was.
This post drew a response from Elon Musk, who said, “You’re right.” Just a few days and a relatively unassuming point release later, Tesla has already decided to take action to improve its DMS.
The core issue here, which many who use FSD can attest to, isn’t about wanting to be irresponsible. Instead, it is about the current system’s sensitivity. The DMS can feel overly punitive for brief, normal interactions with the vehicle’s center display.
User Experience Woes
Mike P’s experience was common - you can’t even glance at the display to change the song or add a nav stop without the DMS warning you to pay attention.
If you continue, you risk receiving an FSD strike. This leads many drivers to disable FSD and type in their destination while driving manually, which, as even a casual observer can tell, is clearly far more dangerous.
This highlights a safety paradox: a system designed to ensure attentiveness can sometimes lead to less safe workarounds. One must acknowledge that Tesla is in an odd position, being incredibly cautious about safety and ensuring it stays within NHTSA guidelines. However, the nag today is overkill in some situations, such as glancing at the center screen.
Tesla Confirms Change
Musk’s relatively concise answer resonated with his previous outlook on the matter. During Tesla’s Q1 2025 Earnings Call, he acknowledged that the DMS can be too strict and mentioned that Tesla is actively looking into ways to loosen those restrictions. He also pointed out the irony of the current system encouraging users to disengage FSD for simple tasks, only to re-engage it moments later - a less-than-safe cycle.
In a post on X, Ashok Elluswamy, Tesla’s VP of Autopilot AI, delivered welcome news. He confirmed that the latest FSD update, V13.2.9, includes a loosening of the cabin camera nag. This is an undocumented change, and one that we’re very excited to see.
This undocumented change is the latest step in Tesla’s overall path toward Unsupervised FSD, which would drop the DMS completely. Previous updates, like the shift to vision-based driver attention monitoring in V12.4 and V12.5, aimed to balance safety with user experience.
What Does This Mean?
While the full extent of the changes in V13.2.9 will become clearer as Software Update 2025.14.6 rolls out to more FSD users, the confirmation of a loosened cabin camera nag suggests a few things.
This likely means greater tolerance for brief glances at the screen for essential tasks, whether it be adjusting climate settings, inputting a nav destination, or changing the current song. It could also include a more forgiving threshold for looking away, especially in low-speed scenarios. Today, the DMS does not ding you for using the display or looking away while the vehicle is waiting at a red light, but Tesla could expand that leniency to driving under 10 mph (16 km/h).
Ashok Elluswamy, Tesla's Vice President of Autopilot and AI Software, recently discussed the current state and future ambitions of Tesla's artificial intelligence programs. He covered FSD before extending the discussion to the broader topics of robotics and Artificial General Intelligence (AGI).
Journey to Truly Autonomous Driving
At the core of Tesla’s AI efforts lies the quest for fully autonomous vehicles. Ashok reiterated the long-term vision where, eventually, all newly manufactured cars are expected to be self-driving, with older, human-driven cars potentially becoming items for specialized hobbies or unique purposes.
However, he did acknowledge that the current advanced driver assistance systems (ADAS), including Tesla’s own FSD, require better reliability before the human can be completely removed from the equation.
The development process, he emphasized, is fundamentally rooted in machine learning rather than traditional programming. A crucial aspect of this is that AI is consistent across every vehicle, learning collectively from the fleet’s experiences rather than being unique to each car.
Progress in AI is continuous, but safety and reliability remain Tesla’s focus for FSD. With Tesla just weeks away from launching its Robotaxi Network in Austin, Texas, that focus matters more than ever, as any accident could delay the program’s expansion or stop it entirely.
No LiDAR
Ashok confirmed that Tesla still has no interest in LiDAR while discussing Tesla's vision-based sensor suite. He reiterated that cost and scalability remain key concerns with LiDAR, adding that its perceived usefulness diminishes as vision-based systems continue to improve.
Beyond the Road: FSD and Robotics
Ashok described Tesla’s AI network poetically - a “digital living being.” This emphasizes the organic way FSD absorbs information from the environment and learns from it. But FSD isn’t just for cars. Tesla uses FSD, as well as the same AI4 hardware from its vehicles, for its humanoid robot, Optimus.
Ashok expects a tremendous wave of progress in robotics over the next 10 to 20 years. A key part of this will be the development of humanoid robots, which he believes will eventually be capable of complex industrial and domestic tasks and of interacting through natural language, likely by 2035.
This recent surge in AI capabilities has been heavily driven by advancements in deep learning and the availability of massive computing power. Tesla is making heavy investments in both software and hardware. It recently started construction of its Cortex 2.0 Supercomputer cluster at Giga Texas.
Envisioning Sustainable Abundance & AGI
The conversation also covered the topic of Artificial General Intelligence. Ashok offered a bold prediction that AGI could arrive within as little as the next 10 years, based on the rate of advancement he’s seen so far. He further projected that AI-based software could become capable of performing most human tasks, whether spreadsheets or even robotic athletics, within the next 15 years.
This technological leap, he believes, ties into Tesla’s newer mission statement of sustainable abundance - a future in which the combination of intelligent machines and effective robotics helps move greater portions of society out of poverty. This has been Tesla’s guiding philosophy since the 2025 All-Hands Meeting earlier this year.
Sustainable abundance should be a win-win scenario for all involved, helping reshape both production and creative industries to help humans do what they want to do rather than what they have to do.
Future of Mobility
As FSD and other AGI technologies mature, Ashok believes that all cars manufactured by 2035 will be autonomous. By then, the very concept of car ownership may change: owning a car would become a more “premium experience,” as the convenience and efficiency of self-driving vehicles might make personal ownership less of a necessity for many people. This shift would also necessitate infrastructure improvements to accommodate potentially increased vehicle usage.
We took a look at what the future may look like when autonomous vehicles become commonplace. It’ll have a drastic effect on our society, as parking lots will need to be a fraction of the size they are today, drop-off and loading zones will need to be bigger, and, for the most part, road signs may no longer be needed.
Will need this big time in the future. With autonomous vehicles we'll have affordable premium transport for everyone. This will likely increase traffic due to the increased usage, even though each vehicle is much more efficiently utilized. https://t.co/xvdvmxmzxd
Touching on the Indian vehicle market, Ashok noted that EVs, especially when combined with technologies like FSD, are well suited to typical travel patterns in India and could make a big difference there. With Tesla eyeing a potential factory expansion into India in the coming years, there’s a lot riding on Tesla being able to take on the challenge of Indian roadways, where traffic laws are often loosely enforced and inconsistently followed.
Ashok’s interview was a fantastic look into what he believes will be next for Tesla - and he left with some parting advice for the next generation of engineers.
Master core concepts and leverage the wealth of online resources available. He emphasized talent and innovation over traditional corporate hierarchies, and reminded engineers not to lose sight of their priorities: work and family.
You can watch the full interview here. Closed captioning is available.