Assessing the Reality of Tesla’s Full Self-Driving Capabilities: A Closer Look at Independent Evaluations


How reliable is Tesla’s Full Self-Driving feature? Recent tests show concerns


AMCI Testing reveals Tesla’s FSD software requires intervention every 13 miles.


With Tesla’s highly anticipated RoboTaxi presentation just around the corner, news from AMCI Testing, an independent research firm, casts a shadow over the event. AMCI says it conducted a thorough real-world examination of Tesla’s Full Self-Driving (FSD) software, a key technology for the upcoming RoboTaxi, and reports unsettling findings.

In their test, which spanned over 1,000 miles, AMCI characterized the performance of Tesla’s FSD software as “questionable.” This isn’t Tesla’s first time facing scrutiny over its FSD. Over the years, the automaker has encountered numerous criticisms ranging from misleading advertisements flagged by the California DMV to investigations by the NHTSA.


The issues plaguing Tesla’s Autopilot and FSD are so extensive that we’ve created a comprehensive thread to keep track of them all. It’s important to highlight that Tesla continues to market FSD as a “beta” product, indicating it isn’t fully developed, yet they offer the feature as a pricey add-on for their current electric vehicle lineup. Owners effectively become test subjects for this evolving technology. Tesla does inform buyers that the system requires constant driver oversight and is not fully autonomous. Nevertheless, the company seems to be shifting the kind of testing that other manufacturers perform in controlled environments onto customers in everyday settings. AMCI’s results further illustrate the challenges facing Tesla and its FSD technology.


AMCI conducted its trials using a Tesla Model 3 running FSD versions 12.5.1 and 12.5.3 across varied settings: city streets, rural highways, mountain roads, and interstates. The testers noted that FSD’s camera-only approach performed impressively, but pointed out that it is unique to Tesla; other automakers typically combine cameras with radar, lidar, and ultrasonic sensors for added redundancy and accuracy. Even so, the FSD software required human intervention every 13 miles on average to maintain safe operation.


David Stokols, CEO of AMCI Testing’s parent company, AMCI Global, highlighted the issue of public trust in self-driving technology, stating: “When it comes to hands-free driving systems, there exists a bond of trust between the tech and the public. However, many are unaware of important caveats such as the need for monitoring. This lack of understanding can lead to complacency, which is risky.” His assessment emphasizes that, while the technology shows promise in certain conditions, it struggles to deliver consistently safe performance.


For those interested, the complete test results are available. Here’s a summary from AMCI:

  • Over 1,000 miles driven
  • Testing on city streets, two-lane highways, mountain roads, and freeways
  • Conducted day and night, under various lighting conditions
  • Utilized a 2024 Model 3 Performance with Hardware 4
  • FSD (Supervised) Profile Setting: Assertive
  • Demonstrated capabilities but revealed significant issues (and sometimes dangerously poor performance)
  • The tech’s confidence during complex driving situations can mislead users into viewing it as infallible, although it often fails to evaluate risks appropriately.


Even if 13 miles between driver interventions seems acceptable, it’s crucial to consider how these situations arise. AMCI made a particularly concerning note: “When mistakes happen, they can be sudden, dramatic, and perilous. During such moments, a driver, who isn’t engaged, might not respond swiftly enough to avert an accident—or even a fatal incident.”


To support its claims, AMCI shared three videos demonstrating instances of unsafe FSD behavior.

Tesla has not yet publicly addressed this report, and a response seems unlikely. The automaker may lean on the argument that the software is still being refined. Even so, offering a feature named Full Self-Driving, a name that implies autonomy the system does not yet deliver, to ordinary drivers at this stage raises concerns, especially when the system’s mistakes can have serious consequences. AMCI’s testing indicates that FSD’s limitations surface frequently.