Oi, Elon: You Musk sort out your Autopilot! Tesla loyalists tell of code crashes, near-misses
Carmaker's unpredictable 'super cruise control' tech blamed for a ton of close calls
Tesla CEO Elon Musk asked the Tesla owners among his millions of Twitter followers last week what aspect of their electric cars they'd most like to see improved or fixed.
Among the 24,000 or so replies, there's a fair amount of concern about Autopilot, the assistive driving software in Tesla's cars.
The first reply came from a Twitter user identified as Mike Leonardi, who wrote, "Autopilot lane changes in traffic really need help!"
Musk replied that it's a top priority for Tesla, and chalked the software's shortcomings up to an excess of caution.
But Tesla's not been so cautious that it withheld the software from the public. Rather, in early October, it introduced Software Version 9.0 for its Model S, Model X, and Model 3 cars, and its mobile app.
The update includes support for a new Autopilot feature called Navigate on Autopilot, "an active guidance feature that, with driver supervision, guides a car from a highway’s on-ramp to off-ramp, including suggesting lane changes, navigating highway interchanges and taking exits."
These new capabilities have been slowly rolling out to Tesla owners, but there's concern Autopilot doesn't work very well.
The car biz has plenty of ardent fans who love the idea of beta testing buggy code at high speeds, and who reflexively characterize critics as trolls or short sellers of Tesla stock. There are, of course, people who highlight Autopilot problems with an eye toward investment.
But there are also customers with no ulterior motive who worry the technology isn't ready and isn't safe.
Effusive reviews of the latest Autopilot update can be found, as can less positive ones, such as a detailed critique posted to the Tesla Motors Club forum earlier this month that notes Navigate on Autopilot "tries to kill you any time a lane ends."
Twitter user @trumpery45, posting under the name Justin, gathered a collection of replies to the Tesla leader's request for fix suggestions in his Twitter feed. The Register asked Justin whether we could attribute his observations to his full name, but he declined, citing the potential for harassment by Tesla fanatics.
@elonmusk Hi Elon, I am spirited how bad is Tesla customer service, it has been more than 12 hr that I left my number for a call back, was on a freeway yesterday night and my new model3 Autopilot and front radar stopped functioning Worried, pls help pic.twitter.com/lJUC9T8mWV— Atul P (@AtparT) November 11, 2018
Autopilot likes to swerve into the tail end of a merge instead of staying with the center line. This behavior is good at speeds less 20 mh but bad at 50+— William K Spiller (@wkspiller) November 11, 2018
@elonmusk Could Autopilot please stop disabling itself arbitrarily, and not require a full stop to re-engage? Three days driving upstate, it has happened every time. I’m paying attention, it’s not a user error 🤕— Michael Heilemann (@Heilemann) November 3, 2018
On route 47 in Yorkville, IL when using navigation my Model 3 center console crashes repeatedly. It’s only in that small geographical area. Started with V9 and still happening with autopilot on nav update— Brandon Bernicky (@brandonbernicky) November 9, 2018
Phantom braking while in autopilot— Mark Svendsen (@marksvend) November 9, 2018
Via Twitter DM, he explained that as Tesla's ambitions for Autopilot have increased, the gap between hype and reality has become more obvious.
"It is scary to think the intention is to give the car the ability to initiate lane changes and navigate off ramps and on ramps and merges when it has such a dim model of what’s going on at any split second," he said.
The litany of Autopilot woes in Justin's collection describes software that crashes and switches itself off arbitrarily, a lack of cross-traffic detection, collisions with off-ramp barriers and curbs, radar failures, unexpected swerving, tailgating, ghost braking for overpasses, speed limit database errors, and uneven speed changes, among other ills.
On October 30, Tesla was sued in a Florida state court over claims that the car company "duped consumers into believing its Autopilot add-on feature can safely transport passengers at high speeds with minimal input and oversight from those passengers."
Tesla has maintained that its Autopilot system should only be used with the driver's hands on the wheel. The company did not respond to a request for comment. ®
Updated to add
After this story was published, a Tesla spokesperson pointed to the company’s Q3 safety report and offered this statement about the Autopilot lawsuit:
We don’t like hearing about any accidents in our cars, and we are hopeful that those involved in this incident are recovering. In this case, the car was incapable of transmitting log data to our servers, which has prevented us from reviewing the vehicle’s data from the accident. However, we have no reason to believe that Autopilot malfunctioned or operated other than as designed.
When using Autopilot, it is the driver’s responsibility to remain attentive to their surroundings and in control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not, including by offering driver instructions when owners test drive and take delivery of their car, before drivers enable Autopilot and every single time they use Autopilot, as well as through the Owner’s Manual and Release Notes for software updates.