Tesla released its latest Vehicle Safety Report, and it shows a slight improvement in the figures recorded for its Autopilot driver-assist technology. Here’s the latest update:
In the 3rd quarter, we registered one accident for every 4.59 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.42 million miles driven. For those driving without Autopilot and without our active safety features, we registered one accident for every 1.79 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 479,000 miles.
By way of comparison, the previous quarter saw one accident for every 4.53 million miles driven with Autopilot engaged, so this quarter marks a modest improvement. On the surface, that seemingly impressive performance indicates the cars crash less often when Autopilot is engaged. But a lot of useful context is missing, such as which stretches of road drivers most often engage the driving assistant on (Autopilot is designed primarily for highway driving, where crash rates per mile tend to be lower), or how many drivers switch the system off ahead of areas where they suspect they’ll need to take over.
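To put the raw numbers in perspective, the ratios between Tesla’s figures and NHTSA’s national baseline can be computed directly. Here’s a minimal sketch using only the figures quoted above (the dictionary keys and variable names are our own shorthand, not Tesla’s or NHTSA’s labels):

```python
# Miles between reported accidents, per Tesla's Q3 safety report
# and NHTSA's national baseline, as quoted above.
MILES_PER_ACCIDENT = {
    "Autopilot engaged": 4_590_000,
    "Active safety features only": 2_420_000,
    "No Autopilot, no active safety": 1_790_000,
    "US average (NHTSA)": 479_000,
}

baseline = MILES_PER_ACCIDENT["US average (NHTSA)"]
for mode, miles in MILES_PER_ACCIDENT.items():
    # How many times farther a car travels between accidents
    # in this mode compared with the national average.
    print(f"{mode}: {miles / baseline:.1f}x the national average")
```

Run as-is, this prints roughly 9.6x for Autopilot, 5.1x for active safety features only, and 3.7x with neither, which is exactly why the missing context about road types matters: the comparison only holds if the miles being compared are similar.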
It’s also important to note that these figures cover Autopilot miles from the previous quarter, so they do not reflect Tesla’s recently launched Full Self Driving beta rollout. Still, there’s an interesting parallel that will likely play out over the coming months and years. Tesla’s software is designed to “learn” as it goes, meaning it should improve as more vehicles hit the road and more human drivers interact with it. Based on Tesla’s numbers, that seems to have held true for Autopilot, and we hope it remains true as more features are added to the system.
Meanwhile, a story by Stef Schrader at The Drive chronicles a series of driving failures by Tesla’s Full Self Driving technology, which has been released to a limited number of owners in beta form. If you’re not familiar with the term, “beta” means it’s not a finished, fully polished version of the technology. As we’ve reported before, Tesla owners are required to accept terms and conditions that include a disclaimer that the system “may do the wrong thing at the worst time.” But — and this is one massive “but” — even if a Tesla owner is willing to consent to take part in the beta, the other drivers sharing the road with that owner have not been offered the same choice. And that’s a big problem.
🚨 Omar Qazi nearly crashes his #Tesla while using “Full Self Driving” beta software. None of the other cars consented to his experiment. $TSLA $TSLAQ pic.twitter.com/uU2RT9l5ZI
— Greta Musk (@GretaMusk) October 25, 2020
🚨🚨🚨

Take 41 seconds and watch this video.

OUTRAGEOUS!!

H/T @StultusVox cc @Tweetermeyer @PAVECampaign @AlexRoy144 $TSLAQ pic.twitter.com/4neqQxJPwr

— TC (@TESLAcharts) October 25, 2020
Not surprisingly, the National Highway Traffic Safety Administration says it’s monitoring Tesla’s beta rollout “and will not hesitate to take action to protect the public against unreasonable risks to safety.”