US watchdog sticks probe into 2.6M Teslas over so-called Smart Summon crash reports
If it's spelled like an A.S.S, acts like an ass, maybe it's just...
The US National Highway Traffic Safety Administration (NHTSA) has launched a probe into Tesla's software that allows cars to operate autonomously over short distances, after reports of the code crashing in a physical sense.
The software under investigation is called “Smart Summon” and “Actually Smart Summon,” and is accessed via the Tesla app. It can be used to instruct some of the automaker’s vehicles to drive themselves to the location of a driver’s choosing. Tesla says the code can navigate around obstacles to reach the desired destination, though only over distances of up to 100 metres, and suggests it as a means to do things like automatically leave a tight parking space. You can see it in action below.
Tesla’s promises may be hollow: Multiple reports claim the self-driving machines have collided with obstacles and other vehicles. Twelve of these apparent accidents involved Teslas using the original Smart Summon software, while four more incidents have since been alleged by customers using its successor, Actually Smart Summon, including one formal complaint being investigated by the NHTSA's Office of Defects Investigation (ODI).
"All four incidents involve the subject Tesla vehicles operating in Actually Smart Summon failing to detect posts or parked vehicles, resulting in a crash," the watchdog wrote [PDF] on Tuesday.
"ODI is aware of multiple crash allegations, involving both Smart Summon and Actually Smart Summon, where the user had too little reaction time to avoid a crash, either with the available line of sight or releasing the phone app button, which stops the vehicle’s movement," the NHTSA added.
- Tesla Cybertruck, a paragon of reliability, recalled again
- Cruise robotaxis parked forever, as GM decides it can't compete and wants to cut costs
- Judge again cans Musk's record-setting $56B Tesla package
- Elon Musk's galactic ego sows chaos in European politics
Notably, the agency said it had received no reports of these alleged incidents from Tesla, which is required to disclose any accident involving a car using self-driving software on a public road. The probe affects every Tesla Model S, 3, and X purchased since 2016, potentially covering 2,585,000 or so vehicles.
Smart Summon was widely panned after its launch in 2019 as being buggy and prone to either not working at all or making cars bump into objects – thin trees seemed to be a particular problem. Actually Smart Summon debuted in September 2024 and, while sentiment on Tesla forums suggests it is an improvement, issues remain.
"Actually Smart Summon and Full Self-Driving are defective engineering prototypes, and should not be allowed on the road," said long-term Tesla critic Dan O'Dowd, the billionaire founder of the Dawn Project, in a statement sent to The Register, referring to FSD, Tesla's more general-purpose self-driving software.
"Tesla has repeatedly shown its contempt for regulators and it is concerning that Tesla failed to report any of the 15 accidents NHTSA identified Actually Smart Summon to be involved in to federal regulators,” he claimed.
"Tesla claims its software is safe but, just like with its Full Self-Driving software, it has been under-reporting the number of crashes that its software has been involved in and then used this incomplete data to paint a false picture of the software’s safety."
The NHTSA and Tesla regularly cross swords over the regulation of self-driving cars, and the automaker is subject to multiple investigations by the agency.
Tesla was silent on the matter at the time of writing, and given how close its CEO Elon Musk now is to incoming US president Donald Trump, it may well be able to get away with doing nothing at all. ®