Tesla's 'Full Self-Driving' Can Only Travel 13 Miles Without A Driver Stepping In. AMCI Released More Videos.

For years, Tesla has proudly paraded its advanced driver-assistance system, Full Self-Driving, as the real deal. The company claims the system can navigate traffic and manage highway driving, and has repeatedly called it the future of driving, even as the number of crashes, collisions and even deaths linked to the system has mounted. Now, a new study has looked into just how far the system can actually drive before needing assistance from a human, and the answer is only 13 miles.
Automotive research company AMCI Testing wanted to find out just where the limits of Full Self-Driving lie, so it set out to cover more than 1,000 miles on the streets of California, reports Ars Technica. Over the course of that driving, its researchers had to step in and take the wheel from the Tesla system more than 75 times.
Safety drivers riding in the Full Self-Driving-equipped Teslas had to take control of the car roughly every 13 miles, reports Ars Technica, due to run-ins with red lights and, in some instances, oncoming cars.
Those shortcomings with Autopilot and FSD have been well-documented, with owners reporting that their Teslas have failed to recognize everything from rail crossings to parked police cars. In some instances, the issues FSD has when it comes to recognizing obstacles and hazards in the road have led to crashes.
Tesla, however, hasn't been forthcoming about how often actual owners have to step in and take control of their cars.
AMCI Testing continues its extensive, 1,000-mile evaluation of the Tesla Full Self Driving (Supervised) system in advance of Tesla's Oct 10th Robotaxi reveal event. Six newly filmed driving scenarios are available at the link below. As with previous videos, these continue to show that, although FSD can often drive the car competently for limited distances across a wide range of scenarios, the mistakes it does make continue to put occupants and the public at significant risk.
Beyond FSD's actual performance in the instances shown, these potentially dangerous driving errors demonstrate the incontrovertible need for regulation and agreed-upon metrics for comparable, system-to-system evaluation. One of the key metrics to define across all types of autonomous and semi-autonomous systems is what constitutes an "intervention": which system actions warrant one, and how regulators should score one whenever it occurs.
AMCI Testing's protocol requires an intervention (the driver taking control and forcing disengagement of the system) whenever FSD's actions put the occupants, the public or other motorists at risk. As stated in our previous release, FSD's behavior required 75 interventions in 1000 miles of real-world testing, for an average forced system disengagement rate of 1 every 13 miles.
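The quoted rate follows from simple arithmetic; a minimal sketch of the calculation, using only the figures reported above:

```python
# AMCI Testing's reported figures: 75 forced disengagements
# over 1,000 miles of real-world driving.
miles_driven = 1000
interventions = 75

# Average distance covered between interventions.
miles_per_intervention = miles_driven / interventions
print(f"~{miles_per_intervention:.1f} miles per intervention")  # ~13.3 miles
```

The exact figure is 13.3 miles; the article's "1 every 13 miles" rounds it down.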
"The key consideration is: can any hands-free system currently be ethically operated by consumers on public roads? To advance the safety of the industry, AMCI Testing has now articulated a standard. But really, that is the question: should we or the public be generating a standard, or is it the responsibility of a federal or state regulator to arrive at an 'intervention' standard?" asked David Stokols, CEO of AMCI Testing's parent company, AMCI Global. "Further, if there are too many incidents, as we have seen in AMCI Testing's results, then the public will lose confidence in all FSD and Robotaxi-type software solutions from any OEM."
We have found FSD's evolving programming and unexpected changes between software versions to be proof of the critical need for more specific regulation and oversight. The obvious precedent is "Autoland," developed for airliners in the mid-1970s to allow zero-visibility operation with far fewer variables than occur on a public road. Certification required a failure probability of less than 1 in 150,000 per occurrence.
Arguably, the 1-in-150,000 goal is what we should be aiming for in road-based autonomous systems. "Extrapolate the failure rate AMCI Testing experienced in only 1,000 miles of driving with FSD (Supervised) and you can see the Tesla system is nowhere near that mark. Additionally, FSD does not appear to be on a progressive, problem-solving track. There are inexplicable performance regressions that sometimes occur as the software updates," said Guy Mangiamele, Director of AMCI Testing.
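Mangiamele's extrapolation can be made concrete. As an illustration only, and on the assumption (ours, not the article's) that one "occurrence" is counted as one mile of driving:

```python
# Compare the observed FSD intervention rate with the Autoland
# certification target of 1 failure per 150,000 occurrences.
# Assumption for illustration: one "occurrence" = one mile driven;
# the article does not define this mapping.
observed_rate = 75 / 1000        # interventions per mile (0.075)
autoland_target = 1 / 150_000    # failures per occurrence (~6.7e-6)

ratio = observed_rate / autoland_target
print(f"Observed rate is roughly {ratio:,.0f}x the Autoland target")  # roughly 11,250x
```

Under that (loose) mapping, the observed rate sits four orders of magnitude above the aviation benchmark, which is the gap the quote is pointing at.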
AMCI Testing has released the next three videos in its series of tests intended to demonstrate the complex issues of trust and performance that FSD continues to pose to drivers and the public. All the test videos released to date are available at https://amcitesting.com/tesla-fsd/