What happens when our Tesla Model Y’s cameras can’t see? The results aren’t good.
Tesla, especially under Elon Musk’s leadership, is all in on cameras. According to the New York Times, Musk has told his team that if humans can drive using just their eyes, vehicles should be able to do the same. Unfortunately, that isn’t the reality: when the cameras can’t see, the technology that depends on them degrades or shuts down entirely. Losing key features of our 2023 Tesla Model Y Long Range, even temporarily, is understandably exasperating, and it has us reconsidering how much those features are worth and the decisions that got us here.
Does it have all the necessary hardware?
Tesla initially promoted its new models as carrying all the hardware required for genuine autonomous driving, or “Full Self-Driving” as the company calls it, claiming that getting there was merely a matter of sufficiently training the software. Tesla has since walked back that claim.
At the same time, Tesla was busy disabling and even removing hardware from its vehicles. The forward-facing radar was switched off in existing models and deleted from new production. Ultrasonic parking sensors followed. In both cases, Tesla and Musk argued that software advancements had made these sensors redundant, maintaining that cameras alone suffice.
By contrast, every other company we’ve spoken with in the autonomous vehicle field disagrees. Outside of Tesla, the consensus is that true autonomous driving requires multiple sensor types—each with its own strengths and weaknesses—supplying overlapping, redundant information. The reasoning: any single sensor type can fail, be obstructed, or miss vital data, leaving the entire system momentarily blind.
Nature offers proof
The drawbacks of Tesla’s strategy and the insight of its autonomous vehicle rivals have become clear to us on numerous occasions—most frequently during the early morning hours.
In our area, heavy fog and morning mist are common from late fall through early spring, often condensing on the car’s windows. Defrosters and wipers can clear the glass, but not the camera lenses and their protective coverings. When those lenses fog up, the only fix is for someone to walk around the vehicle and wipe them clean. Forget to do this before driving off, and you’ll soon find that Park Assist, Tesla’s camera-based parking assistant, is “degraded” and unreliable because its cameras are obscured.
This annoyance may seem minor, but it’s frustrating to know that ultrasonic parking sensors, which don’t face this issue (despite their own problems), were once a standard feature in these cars.
The issue extends beyond manual parking as well. Both Tesla’s self-parking and “Summon” features also depend on the cameras; thus, if you’re expecting your vehicle to autonomously navigate out of a parking spot or come to you, you might be left disappointed.
This situation highlights a broader issue: Tesla has not devised a solution for cleaning the cameras when they’re fogged or dirty, leaving the responsibility solely on the driver. While the company did try to position the cameras where they would be less likely to become obstructed, once that happens, the car is helpless. Most other vehicle manufacturers do not universally install camera washers either, but they also don’t rely solely on cameras for their vehicles to function as intended.
Obstructed cameras are one thing in the safety of your driveway; they’re a far greater problem in hazardous weather. In heavy fog or rain, cameras that can’t see can disable the “Full Self-Driving (Supervised)” system, along with the Autosteer and Autopilot functions, all in the name of safety. A little rain can neutralize the car’s hallmark tech feature, which cost $15,000 when we bought it. We appreciate the vehicle prioritizing safety, but it’s frustrating to know that preventive measures were available and went unused.
It’s worth mentioning that while all aspiring autonomous vehicles struggle in adverse weather, Teslas are particularly vulnerable because they depend solely on cameras. Competing brands address such conditions with alternative sensors, like radar, which keeps working in fog. Recent advancements also show that lidar, which Musk frequently downplays, performs better in fog than cameras do. Radar and lidar still face icing challenges, but vehicles using those systems have integrated heating elements that melt snow and ice to keep the sensors clear.
Why does Tesla avoid using more sensors?
Set aside Musk’s analogy equating eyes to cameras; experts who study the human eye and visual processing say that cameras and computers work quite differently from eyes and brains. The difference lies not only in how they gather light but also in how they process it, and the comparison doesn’t hold beyond the surface level.
Musk’s primary argument against additional sensors is cost. He has long deemed lidar too expensive, and the logic extends beyond lidar: cutting radar (which can cost three times as much as a camera) and ultrasonic sensors from the lineup reduces Tesla’s material, research, and assembly expenses. Skipping the work of designing and sourcing those parts, integrating them, training the computer to fuse data from different sensor types, and complicating the assembly process potentially saves the company substantial money. If cameras could truly replicate what those sensors do, it would be a savvy business strategy. Unfortunately, Tesla’s real-world results have not borne that out.
Tesla’s changing perspective
Recent developments suggest that Tesla’s viewpoint might be shifting. Various legal battles and investigations have revealed dissent among Tesla engineers regarding the company’s marketing assertions. When faced with scrutiny, the engineers had to retract statements concerning the performance of their cameras and software and even acknowledged that some promotional videos exaggerated the capabilities of the system. In the latest legal case, Tesla’s lawyers argued in court that lidar is essential for achieving self-driving capabilities, admitting that current vehicles cannot operate autonomously without it.
Additionally, documents submitted to government regulators indicate that Tesla has requested permission to reintroduce radar technology in its vehicles. (Since radar operates on radio frequencies, it must comply with Federal Communications Commission regulations.) Furthermore, the company has recently acquired several hundred lidar units, likely for research and development, suggesting a potential shift in its stance on this technology as well.
Current situation
It appears we might have to live with the status quo. Tesla will continue updating its software and delivering more over-the-air (OTA) updates to enhance vehicle functionality. Although the company offers owners of certain older models a paid upgrade for their car’s computer, it does not retrofit components like radar systems or cameras. That’s simply part of the risk of buying a Tesla: the price, hardware, and features can change unexpectedly.
Explore Our Long-Term Review of the 2023 Tesla Model Y Long Range:
- We Test Drove a 2023 Tesla Model Y Long Range for a Year
- The Differences with Supercharging
- How Much Can You Tow with a Tesla?
- Impact of the Tesla Autosteer Recall: Not Much Changed
- Are the Third-Row Seats in the Tesla Model Y Truly Worth It?