Another investigation into 2.6 million Tesla vehicles raises the question: is Tesla pushing the boundaries of innovation, or going too far?
Electric vehicles are likely to be the future of motoring. Similarly, autonomous vehicles will likely play a big role on our roads in the years to come. One company continues to push the boundaries of innovation in both areas: Tesla.
However, the electric vehicle manufacturer has suffered a setback. The US National Highway Traffic Safety Administration (NHTSA) has launched another probe into the company, this time over its Smart Summon and Actually Smart Summon features.
Both features enable Tesla owners to summon their vehicles, which autonomously drive to the owner's location from wherever they're parked, provided the car is within 100 meters. The principle behind the features is that you don't need to walk to your car when leaving your home or a restaurant – your car can come to you.
Investigation opened
More than a dozen incidents involving the two features have been reported to the NHTSA, each resulting in a crash in which the system malfunctioned in some way.
“ODI [a subsection of the NHTSA] is aware of multiple crash allegations, involving both Smart Summon and Actually Smart Summon, where the user had too little reaction time to avoid a crash, either with the available line of sight or releasing the phone app button, which stops the vehicle’s movement,” the report reads.
The initial Smart Summon feature has been available to Tesla owners for around five years, while Actually Smart Summon launched only a few months ago.
“It’s unsurprising that NHTSA is investigating Actually Smart Summon, given the litany of safety defects and crashes involving Tesla’s self-driving technology,” says Dan O’Dowd, the founder of The Dawn Project, a safety advocacy group.
“Actually Smart Summon and Full Self-Driving are defective engineering prototypes, and should not be allowed on the road,” claims O’Dowd. However, none of the incidents being investigated by the NHTSA resulted in injuries.
Pushing the boundaries?
The key question is whether the latest probe – one of many the company has faced over the years – indicates something wrong at the company (Tesla did not immediately respond to a request for comment), or is simply an expected consequence of operating at the cusp of innovation.
Autonomous vehicles of the type Tesla is developing will always carry some rate of error, even when operated autonomously only over short distances, as with the Smart Summon features. Unpredictable road conditions and the many edge cases such systems must navigate make occasional failures all but inevitable.
Tesla’s self-driving systems have operated on roads under human oversight for several years – meaning the driver can, and should, intervene quickly if the automated system misfires. A summoned vehicle has no such safeguard, because there’s no one behind the wheel.
However, the data suggests that Tesla vehicles are involved in more accidents than other vehicles, with a fatal accident rate double the average, according to a November 2024 study. Tesla vehicles are involved in 5.6 fatal accidents per billion miles driven, compared with 2.8 per billion miles across all other vehicles.
For companies operating on the edge of innovative technology, some might suggest that’s part and parcel of making breakthroughs. But such incidents are happening on real roads, to real people – and are getting real attention from regulators.